Apr 24 21:13:57.887721 ip-10-0-132-81 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:13:57.887737 ip-10-0-132-81 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:13:57.887747 ip-10-0-132-81 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:13:57.888101 ip-10-0-132-81 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:14:08.074728 ip-10-0-132-81 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:14:08.074749 ip-10-0-132-81 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7ec5c3997d26427ca5d917ad6a6eb045 --
Apr 24 21:16:22.023983 ip-10-0-132-81 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:22.493918 ip-10-0-132-81 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.493918 ip-10-0-132-81 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:22.493918 ip-10-0-132-81 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.493918 ip-10-0-132-81 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:22.493918 ip-10-0-132-81 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.497500 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.497412 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:22.499832 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499815 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.499832 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499831 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.499832 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499834 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499839 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499842 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499845 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499849 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499852 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499855 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499858 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499860 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499863 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499871 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499874 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499876 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499879 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499882 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499885 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499887 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499890 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499893 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499895 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.499930 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499898 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499901 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499903 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499906 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499909 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499912 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499914 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499917 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499919 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499922 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499924 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499928 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499932 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499935 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499938 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499941 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499944 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499947 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499949 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.500412 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499952 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499955 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499957 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499960 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499962 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499965 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499968 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499970 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499973 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499975 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499978 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499980 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499983 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499985 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499988 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499992 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499995 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.499998 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500000 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500003 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.500875 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500005 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500008 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500010 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500013 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500016 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500018 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500021 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500024 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500026 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500030 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500032 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500034 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500037 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500040 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500043 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500045 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500048 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500050 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500053 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.501486 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500056 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500060 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500063 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500066 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500069 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.500072 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501368 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501375 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501378 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501381 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501384 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501387 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501390 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501393 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501396 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501399 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501402 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501405 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501408 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501411 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.501956 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501414 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501417 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501420 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501423 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501425 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501428 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501431 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501434 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501437 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501439 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501442 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501445 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501447 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501450 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501452 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501455 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501458 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501460 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501462 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501466 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.502428 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501468 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501471 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501474 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501476 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501479 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501481 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501484 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501487 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501489 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501492 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501495 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501499 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501502 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501505 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501509 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501511 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501514 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501517 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501519 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501522 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.502976 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501525 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501528 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501530 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501534 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501538 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501541 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501543 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501546 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501549 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501552 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501554 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501557 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501561 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501563 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501565 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501568 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501570 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501573 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501576 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.503476 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501580 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501583 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501586 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501588 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501591 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501593 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501596 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501598 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501600 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501603 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501606 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501608 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.501610 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502426 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502435 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502442 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502447 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502452 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502455 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502460 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502465 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:22.503974 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502468 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502472 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502475 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502480 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502483 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502486 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502489 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502492 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502495 2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502498 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502501 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502506 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502509 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502513 2578 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502516 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502519 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502524 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502527 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502530 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502534 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502537 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502540 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502543 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502546 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502549 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:22.504491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502554 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502557 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502560 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502563 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502566 2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502569 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502574 2578 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502578 2578 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502583 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502587 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502593 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502598 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502603 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502607 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502613 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502617 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502623 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502628 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502638 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502643 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502648 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502653 2578 flags.go:64]
FLAG: --feature-gates="" Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502659 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502664 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502670 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:22.505117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502689 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502695 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502700 2578 flags.go:64] FLAG: --help="false" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502705 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-132-81.ec2.internal" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502710 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502716 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502721 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502727 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502733 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502738 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:16:22.502743 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502747 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502753 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502758 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502763 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502768 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502774 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502779 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502784 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502789 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502794 2578 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502799 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502804 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502807 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:22.505729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502814 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:22.506344 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502819 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502822 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502825 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502828 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502831 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502834 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502837 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502842 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502845 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502849 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502852 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502855 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502858 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502861 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502864 2578 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502867 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502871 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502879 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502882 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502886 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502889 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502892 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:22.506344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502898 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502901 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502904 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502907 2578 flags.go:64] FLAG: --port="10250" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502910 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502913 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e23b6a6ffe87946f" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:16:22.502917 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502920 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502923 2578 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502926 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502929 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502934 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502937 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502940 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502943 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502947 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502950 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502953 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502956 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502959 2578 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502962 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502965 2578 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502968 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502971 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502974 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502978 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:22.506897 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502981 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502985 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502989 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502992 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502995 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.502998 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503001 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503004 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503007 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503012 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 
21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503015 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503018 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503023 2578 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503026 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503029 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503032 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503035 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503039 2578 flags.go:64] FLAG: --v="2" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503043 2578 flags.go:64] FLAG: --version="false" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503047 2578 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503052 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503055 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503153 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503157 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:22.507524 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503160 2578 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503163 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503165 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503168 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503170 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503173 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503175 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503178 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503181 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503183 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503186 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503189 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503191 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 
21:16:22.503193 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503196 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503198 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503201 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503203 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503206 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503208 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:22.508106 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503211 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503213 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503216 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503218 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503221 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503224 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:22.508611 
ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503226 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503229 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503232 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503234 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503237 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503239 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503242 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503244 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503247 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503249 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503252 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503254 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503257 2578 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503260 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:22.508611 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503263 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503265 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503267 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503270 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503274 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503278 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503281 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503284 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503287 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503289 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503293 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 
24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503295 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503298 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503300 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503303 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503307 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503311 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503314 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:22.509123 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503317 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503320 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503325 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503327 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503330 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 
21:16:22.503333 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503336 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503338 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503340 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503343 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503346 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503348 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503351 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503354 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503356 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503358 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503361 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503363 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:22.509653 ip-10-0-132-81 
kubenswrapper[2578]: W0424 21:16:22.503366 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503369 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.509653 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503371 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503374 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503377 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503379 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503381 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.503384 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.510214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.503389 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:22.511401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.511371 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:16:22.511401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.511393 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511446 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511452 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511456 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511459 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511462 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511466 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511469 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511471 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511474 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511477 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511480 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511483 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511486 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511488 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511491 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511494 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511497 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511500 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511503 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.511536 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511506 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511508 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511512 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511515 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511517 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511520 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511523 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511525 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511528 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511531 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511533 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511536 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511538 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511542 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511545 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511547 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511550 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511553 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511556 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.512058 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511558 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511561 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511563 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511566 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511568 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511571 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511573 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511576 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511578 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511581 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511584 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511587 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511589 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511592 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511594 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511598 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511600 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511603 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511605 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511610 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.512534 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511614 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511616 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511619 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511622 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511625 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511627 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511630 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511632 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511635 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511637 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511640 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511642 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511645 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511647 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511650 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511652 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511655 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511657 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511660 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511664 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.513049 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511669 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511672 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511809 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511814 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511817 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511820 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511823 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511826 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.511831 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511937 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511942 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511945 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511948 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511951 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511954 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511956 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.513541 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511959 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511962 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511964 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511967 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511969 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511972 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511974 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511977 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511979 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511982 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511984 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511987 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511989 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511992 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511995 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.511998 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512000 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512003 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512007 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.513952 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512011 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512013 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512016 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512019 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512022 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512025 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512028 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512031 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512033 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512036 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512039 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512042 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512044 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512047 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512049 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512052 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512054 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512057 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512060 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512062 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.514426 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512065 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512068 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512071 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
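The kubelet dumps its effective gates in the `feature gates: {map[Name:bool …]}` lines above. As a minimal sketch of pulling such a line into a Python dict for diffing or auditing (this is illustrative log parsing of the format shown here, not a kubelet API; the sample line is abbreviated from the log above):

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse a kubelet 'feature gates: {map[...]}' log line into a dict."""
    m = re.search(r"map\[(.*?)\]", line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"  # Go prints bools as true/false
    return gates

line = ('feature gates: {map[DynamicResourceAllocation:false '
        'ImageVolume:true KMSv1:true NodeSwap:false]}')
gates = parse_feature_gates(line)
# gates["ImageVolume"] is True; gates["NodeSwap"] is False
```

Comparing the dicts from the three `feature_gate.go:384]` dumps in this boot confirms all three passes resolved to the same effective set.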
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512075 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512077 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512080 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512082 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512085 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512088 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512091 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512093 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512096 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512098 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512101 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512103 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512106 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512109 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512111 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512114 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512116 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.514939 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512119 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512121 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512124 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512126 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512129 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512131 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512134 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512136 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512138 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512141 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512143 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512146 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512148 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512151 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512153 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512155 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512158 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512168 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512172 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.515433 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:22.512174 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.515912 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.512179 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:22.515912 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.512992 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:16:22.516737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.516722 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:16:22.517800 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.517788 2578 server.go:1019] "Starting client certificate rotation"
Apr 24 21:16:22.517904 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.517886 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:22.517943 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.517921 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:22.544270 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.544246 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:22.546034 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.546005 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:22.558899 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.558871 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:16:22.566691 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.566615 2578 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:22.569969 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.569943 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:22.573382 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.573358 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9788decb-3c0c-4a86-adbc-3f3915e30791:/dev/nvme0n1p4 dbfbcbaf-7b7a-4cc7-a928-45c4fb837d1f:/dev/nvme0n1p3]
Apr 24 21:16:22.573469 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.573380 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:22.578314 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.578201 2578 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:22.577121492 +0000 UTC m=+0.428309982 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105183 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec228f92ba66e8ef06888755a4600b98 SystemUUID:ec228f92-ba66-e8ef-0688-8755a4600b98 BootID:7ec5c399-7d26-427c-a5d9-17ad6a6eb045 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7b:09:c7:a3:b9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7b:09:c7:a3:b9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:b4:d2:2e:59:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:22.578314 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.578310 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
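The `fs.go:136]` line above dumps cAdvisor's partition table as a Go struct literal. A minimal sketch of extracting device, mountpoint, and fsType from that format with a regex (illustrative parsing of the log format shown here, not a kubelet or cAdvisor API; the sample line is abbreviated from the log above):

```python
import re

def parse_partitions(line: str) -> dict[str, dict[str, str]]:
    """Pull {device: {mountpoint, fsType}} out of a cAdvisor
    'Filesystem partitions: map[...]' log line."""
    parts = {}
    # [^\s\[]+ keeps the 'map[' prefix out of the device name.
    for dev, mnt, fstype in re.findall(
            r"([^\s\[]+):\{mountpoint:(\S+) major:\d+ minor:\d+ fsType:(\S+)",
            line):
        parts[dev] = {"mountpoint": mnt, "fsType": fstype}
    return parts

line = ("Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot "
        "major:259 minor:3 fsType:ext4 blockSize:0} "
        "/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs "
        "blockSize:0}]")
parts = parse_partitions(line)
# parts["/dev/nvme0n1p4"]["fsType"] == "xfs"
```

On the node itself, `findmnt -t xfs,ext4` would give the same device-to-mountpoint view for cross-checking.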
Apr 24 21:16:22.578420 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.578404 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:22.579328 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.579301 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:22.579480 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.579331 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-81.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:16:22.580115 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.580105 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:16:22.580151 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.580118 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:16:22.580151 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.580131 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:22.580787 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.580770 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:22.581215 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.581204 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:22.582760 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.582747 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:22.582878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.582869 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:16:22.585243 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.585233 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:16:22.585278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.585250 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:16:22.585278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.585264 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:16:22.585278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.585272 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:16:22.585406 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.585281 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:16:22.586389 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.586371 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:22.586389 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.586390 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:22.589660 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.589644 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:16:22.591148 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.591131 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:16:22.593198 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593181 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593201 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593210 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593219 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593227 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:16:22.593277
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593237 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593245 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593264 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:16:22.593277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593275 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:16:22.593523 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593286 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:16:22.593523 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593306 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:16:22.593523 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.593320 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:16:22.594202 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.594191 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:16:22.594268 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.594204 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:16:22.595954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.595926 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:22.596039 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.595951 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:22.598105 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598089 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-81.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:22.598282 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598271 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:16:22.598324 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598309 2578 server.go:1295] "Started kubelet" Apr 24 21:16:22.598420 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598370 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:16:22.598474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598401 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:16:22.598519 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.598492 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:16:22.599379 ip-10-0-132-81 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:16:22.599622 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.599606 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:16:22.601332 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.601315 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:16:22.601512 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.601484 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p56d9"
Apr 24 21:16:22.605927 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.605904 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:22.606704 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.606661 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:16:22.609298 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.609276 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:16:22.609479 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.609196 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:16:22.609559 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.609550 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:16:22.609763 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.609474 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:22.610038 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610026 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:16:22.610122 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610056 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:16:22.610556 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610254 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p56d9"
Apr 24 21:16:22.610650 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610573 2578 factory.go:153] Registering CRI-O factory
Apr 24 21:16:22.610650 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610597 2578 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:16:22.610812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610665 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:16:22.610812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610688 2578 factory.go:55] Registering systemd factory
Apr 24 21:16:22.610812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610697 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:16:22.610812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610726 2578 factory.go:103] Registering Raw factory
Apr 24 21:16:22.610812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.610741 2578 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:16:22.611434 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.610313 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-81.ec2.internal.18a9678cd86a0f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-81.ec2.internal,UID:ip-10-0-132-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-81.ec2.internal,},FirstTimestamp:2026-04-24 21:16:22.598283153 +0000 UTC m=+0.449471643,LastTimestamp:2026-04-24 21:16:22.598283153 +0000 UTC m=+0.449471643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-81.ec2.internal,}"
Apr 24 21:16:22.611566 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.611548 2578 manager.go:319] Starting recovery of all containers
Apr 24 21:16:22.612865 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.612841 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:16:22.613214 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.613189 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 21:16:22.613358 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.613334 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:16:22.623870 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.623698 2578 manager.go:324] Recovery completed
Apr 24 21:16:22.628359 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.628345 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.631364 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.631345 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.631437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.631379 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.631437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.631391 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.631992 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.631976 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:16:22.631992 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.631988 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:16:22.632087 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.632004 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:22.634388 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.634369 2578 policy_none.go:49] "None policy: Start"
Apr 24 21:16:22.634388 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.634387 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:16:22.634511 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.634397 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671360 2578 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.671393 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671403 2578 server.go:85] "Starting device plugin registration server"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671730 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671744 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671832 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671927 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.671937 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.672568 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:16:22.677376 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.672607 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:22.706289 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.706251 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:16:22.707498 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.707482 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:16:22.707578 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.707511 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:16:22.707578 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.707530 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:16:22.707578 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.707542 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:16:22.707720 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.707575 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:16:22.710712 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.710689 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:22.772532 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.772442 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.773454 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.773434 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.773543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.773467 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.773543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.773478 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.773543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.773501 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.782066 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.782041 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.782147 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.782071 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-81.ec2.internal\": node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:22.798602 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.798575 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:22.807704 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.807652 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"]
Apr 24 21:16:22.807786 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.807777 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.808792 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.808773 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.808885 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.808804 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.808885 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.808820 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.810161 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.810148 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.810358 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.810346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.810396 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.810373 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.815963 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.815948 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.816028 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.815982 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.816028 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.815992 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.816113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.815950 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.816113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.816064 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.816113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.816083 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.817561 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.817547 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.817615 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.817573 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.818513 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.818501 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.818570 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.818524 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.818570 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.818533 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.841645 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.841616 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-81.ec2.internal\" not found" node="ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.846314 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.846292 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-81.ec2.internal\" not found" node="ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.899219 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.899187 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:22.910613 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.910590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.910744 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.910620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.910744 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:22.910641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91804ee55f307105f50a9b09138ac4ec-config\") pod \"kube-apiserver-proxy-ip-10-0-132-81.ec2.internal\" (UID: \"91804ee55f307105f50a9b09138ac4ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:22.999976 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:22.999937 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.011372 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.011506 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.011506 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91804ee55f307105f50a9b09138ac4ec-config\") pod \"kube-apiserver-proxy-ip-10-0-132-81.ec2.internal\" (UID: \"91804ee55f307105f50a9b09138ac4ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.011506 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91804ee55f307105f50a9b09138ac4ec-config\") pod \"kube-apiserver-proxy-ip-10-0-132-81.ec2.internal\" (UID: \"91804ee55f307105f50a9b09138ac4ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.011506 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.011506 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.011473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f22afa25508444e921602cc1e4e9b57f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal\" (UID: \"f22afa25508444e921602cc1e4e9b57f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.100708 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.100615 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.144253 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.144215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.149813 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.149797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.201574 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.201541 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.302211 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.302169 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.402875 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.402795 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.503508 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.503466 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.517004 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.516981 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:23.517130 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.517116 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:23.564544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.564516 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:23.603611 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:23.603585 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-81.ec2.internal\" not found"
Apr 24 21:16:23.606035 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.606016 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:23.613317 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.613269 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:22 +0000 UTC" deadline="2027-12-29 08:27:52.205861499 +0000 UTC"
Apr 24 21:16:23.613317 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.613313 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14723h11m28.592553244s"
Apr 24 21:16:23.615120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.615100 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:23.638164 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.638141 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7v84g"
Apr 24 21:16:23.645748 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.645726 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7v84g"
Apr 24 21:16:23.647229 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:23.647202 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91804ee55f307105f50a9b09138ac4ec.slice/crio-e600037cc4b5e3030a4c447a904e6a549136389cff9ef778792bc61a8888b538 WatchSource:0}: Error finding container e600037cc4b5e3030a4c447a904e6a549136389cff9ef778792bc61a8888b538: Status 404 returned error can't find the container with id e600037cc4b5e3030a4c447a904e6a549136389cff9ef778792bc61a8888b538
Apr 24 21:16:23.647332 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.647213 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:23.647461 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:23.647443 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22afa25508444e921602cc1e4e9b57f.slice/crio-750cce40eb5a4c7559ae5f96fdcabf1b049da69c7c374bec215791d8b21d9f4b WatchSource:0}: Error finding container 750cce40eb5a4c7559ae5f96fdcabf1b049da69c7c374bec215791d8b21d9f4b: Status 404 returned error can't find the container with id 750cce40eb5a4c7559ae5f96fdcabf1b049da69c7c374bec215791d8b21d9f4b
Apr 24 21:16:23.654135 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.654115 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:16:23.657889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.657870 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:23.707715 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.707691 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.710494 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.710441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal" event={"ID":"91804ee55f307105f50a9b09138ac4ec","Type":"ContainerStarted","Data":"e600037cc4b5e3030a4c447a904e6a549136389cff9ef778792bc61a8888b538"}
Apr 24 21:16:23.711400 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.711382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal" event={"ID":"f22afa25508444e921602cc1e4e9b57f","Type":"ContainerStarted","Data":"750cce40eb5a4c7559ae5f96fdcabf1b049da69c7c374bec215791d8b21d9f4b"}
Apr 24 21:16:23.718877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.718860 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:23.719861 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.719849 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal"
Apr 24 21:16:23.727450 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:23.727435 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:24.586343 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.586307 2578 apiserver.go:52] "Watching apiserver"
Apr 24 21:16:24.595105 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.595070 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:16:24.595475 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.595452 2578 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kube-system/konnectivity-agent-2c2cp","kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal","openshift-dns/node-resolver-4x96t","openshift-image-registry/node-ca-2tx46","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal","openshift-multus/multus-additional-cni-plugins-cjzlc","openshift-network-diagnostics/network-check-target-dwmtg","openshift-network-operator/iptables-alerter-s2wzz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5","openshift-cluster-node-tuning-operator/tuned-6xzp7","openshift-multus/multus-klqg5","openshift-multus/network-metrics-daemon-62n84","openshift-ovn-kubernetes/ovnkube-node-cg9z2"] Apr 24 21:16:24.598691 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.598651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:24.598805 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.598735 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:24.599907 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.599884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.601255 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.601135 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2tx46" Apr 24 21:16:24.602245 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.602222 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.602348 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.602225 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.602348 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.602309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wxxgs\"" Apr 24 21:16:24.602527 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.602513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.603268 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.603248 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.603367 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.603287 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:24.603658 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.603632 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.604125 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.604099 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qxxrn\"" Apr 24 21:16:24.605336 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.604702 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.605336 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.604742 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.606174 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606153 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:16:24.606367 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606314 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:16:24.606571 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606515 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.606571 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606545 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.606699 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:16:24.606944 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.606927 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vddq6\"" Apr 24 21:16:24.607841 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.607202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:16:24.607841 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.607501 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.607841 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.607549 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:16:24.608422 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.608397 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.609036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.608751 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.609036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.608775 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hqqq9\"" Apr 24 21:16:24.609036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.608957 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d98d6\"" Apr 24 21:16:24.609036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.608993 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:16:24.609698 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.609290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.611126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.611110 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.611906 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.611886 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rcvqf\"" Apr 24 21:16:24.612032 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.612014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.612219 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.612196 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:16:24.612389 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.612374 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.612500 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.612484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:24.612587 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.612564 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:24.613179 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613161 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.613281 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613209 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.613343 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613283 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8ldq2\"" Apr 24 21:16:24.613689 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.613985 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613825 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:16:24.613985 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.613903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-st5tl\"" Apr 24 21:16:24.616222 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.616195 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:16:24.617292 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617273 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:16:24.617437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vhv4z\"" Apr 24 21:16:24.617518 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.617518 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617500 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.617636 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:16:24.617891 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.617874 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:16:24.618884 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgzb\" (UniqueName: \"kubernetes.io/projected/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-kube-api-access-szgzb\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46" Apr 24 21:16:24.618884 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b90d25e6-8bbe-484f-9222-fe772ed03d48-hosts-file\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.619039 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.619039 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.619039 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-lib-modules\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.619039 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.618997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-socket-dir-parent\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.619039 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-hostroot\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619044 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-device-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-sys-fs\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-tmp\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-host\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46" Apr 24 
21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-etc-kubernetes\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-sys\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-serviceca\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-socket-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysconfig\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-iptables-alerter-script\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cni-binary-copy\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.619399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619386 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-bin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-daemon-config\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-system-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-os-release\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-kubernetes\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620023 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:16:24.619560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cnibin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3b3582a-d64c-4ef1-8758-602aabd2be60-agent-certs\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-host-slash\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-systemd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-netns\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-multus\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b90d25e6-8bbe-484f-9222-fe772ed03d48-tmp-dir\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-var-lib-kubelet\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvzd\" (UniqueName: 
\"kubernetes.io/projected/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-kube-api-access-rjvzd\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619966 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-conf\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.619988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-conf-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620023 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-os-release\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-registration-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.620692 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:16:24.620059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8cwj\" (UniqueName: \"kubernetes.io/projected/42031ed4-962e-4310-b28d-4e04504596d2-kube-api-access-n8cwj\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-system-cni-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cnibin\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkd58\" (UniqueName: \"kubernetes.io/projected/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-kube-api-access-hkd58\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-modprobe-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-tuned\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3b3582a-d64c-4ef1-8758-602aabd2be60-konnectivity-ca\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkpq2\" (UniqueName: \"kubernetes.io/projected/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-kube-api-access-rkpq2\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj7nd\" (UniqueName: \"kubernetes.io/projected/372d4cd0-c127-43df-b1c7-06d67c0f967c-kube-api-access-xj7nd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620322 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-k8s-cni-cncf-io\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-kubelet\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7cq9\" (UniqueName: \"kubernetes.io/projected/b90d25e6-8bbe-484f-9222-fe772ed03d48-kube-api-access-d7cq9\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-host\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.620692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.621299 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-multus-certs\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.621299 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-run\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.621299 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.620560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:24.646367 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.646324 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:23 +0000 UTC" deadline="2027-11-07 
20:03:25.511134083 +0000 UTC" Apr 24 21:16:24.646367 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.646365 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13486h47m0.864774952s" Apr 24 21:16:24.670614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.670578 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:24.710809 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.710777 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:16:24.721573 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl92b\" (UniqueName: \"kubernetes.io/projected/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-kube-api-access-rl92b\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:24.721751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.721751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721601 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovn-node-metrics-cert\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 
21:16:24.721751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3b3582a-d64c-4ef1-8758-602aabd2be60-agent-certs\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.721751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-host-slash\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.721751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-systemd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-netns\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-multus\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.721980 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b90d25e6-8bbe-484f-9222-fe772ed03d48-tmp-dir\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-var-lib-kubelet\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvzd\" (UniqueName: \"kubernetes.io/projected/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-kube-api-access-rjvzd\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-systemd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-host-slash\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.721980 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-conf\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-netns\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-multus\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-var-lib-kubelet\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.721980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.721962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-conf-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:16:24.721991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-os-release\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-registration-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-conf-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8cwj\" (UniqueName: \"kubernetes.io/projected/42031ed4-962e-4310-b28d-4e04504596d2-kube-api-access-n8cwj\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722060 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-os-release\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-systemd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-conf\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-registration-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-system-cni-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b90d25e6-8bbe-484f-9222-fe772ed03d48-tmp-dir\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-systemd-units\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-system-cni-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-etc-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722241 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cnibin\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkd58\" (UniqueName: \"kubernetes.io/projected/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-kube-api-access-hkd58\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.722526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722290 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-modprobe-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cnibin\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-tuned\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: 
I0424 21:16:24.722340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3b3582a-d64c-4ef1-8758-602aabd2be60-konnectivity-ca\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkpq2\" (UniqueName: \"kubernetes.io/projected/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-kube-api-access-rkpq2\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-node-log\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj7nd\" (UniqueName: \"kubernetes.io/projected/372d4cd0-c127-43df-b1c7-06d67c0f967c-kube-api-access-xj7nd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-modprobe-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.723339 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:16:24.722446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-k8s-cni-cncf-io\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-kubelet\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7cq9\" (UniqueName: \"kubernetes.io/projected/b90d25e6-8bbe-484f-9222-fe772ed03d48-kube-api-access-d7cq9\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-ovn\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-bin\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-host\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.723339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-multus-certs\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " 
pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-netns\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-run\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szgzb\" (UniqueName: \"kubernetes.io/projected/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-kube-api-access-szgzb\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b90d25e6-8bbe-484f-9222-fe772ed03d48-hosts-file\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-netd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xff2q\" (UniqueName: \"kubernetes.io/projected/38d01fc4-4ff2-408e-baa1-6d9c62d27470-kube-api-access-xff2q\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.722999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-lib-modules\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-socket-dir-parent\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-k8s-cni-cncf-io\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-hostroot\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-host\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3b3582a-d64c-4ef1-8758-602aabd2be60-konnectivity-ca\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-kubelet\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-hostroot\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.724214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-device-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-device-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-sys-fs\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723623 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-script-lib\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-sys-fs\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-run-multus-certs\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-tmp\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-host\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-run\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-etc-kubernetes\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-host\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-etc-kubernetes\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-config\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-sys\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-serviceca\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-socket-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-kubelet\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-ovn-kubernetes\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.723997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysctl-d\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysconfig\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-iptables-alerter-script\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b90d25e6-8bbe-484f-9222-fe772ed03d48-hosts-file\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cni-binary-copy\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-lib-modules\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-bin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-daemon-config\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-log-socket\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-system-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.725817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-os-release\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-slash\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-var-lib-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-env-overrides\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-kubernetes\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cnibin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724467 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42031ed4-962e-4310-b28d-4e04504596d2-socket-dir\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-sys\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-sysconfig\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cni-binary-copy\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-os-release\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-cnibin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.726645 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-kubernetes\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-socket-dir-parent\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.724931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-serviceca\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-system-cni-dir\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-host-var-lib-cni-bin\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-iptables-alerter-script\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-multus-daemon-config\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-etc-tuned\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3b3582a-d64c-4ef1-8758-602aabd2be60-agent-certs\") pod \"konnectivity-agent-2c2cp\" (UID: \"c3b3582a-d64c-4ef1-8758-602aabd2be60\") " pod="kube-system/konnectivity-agent-2c2cp"
Apr 24 21:16:24.727446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.725873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/372d4cd0-c127-43df-b1c7-06d67c0f967c-tmp\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.733802 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.733759 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:24.733802 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.733789 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:24.733802 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.733802 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:24.734012 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.733911 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:25.233883457 +0000 UTC m=+3.085071955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:24.734692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.734645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7cq9\" (UniqueName: \"kubernetes.io/projected/b90d25e6-8bbe-484f-9222-fe772ed03d48-kube-api-access-d7cq9\") pod \"node-resolver-4x96t\" (UID: \"b90d25e6-8bbe-484f-9222-fe772ed03d48\") " pod="openshift-dns/node-resolver-4x96t"
Apr 24 21:16:24.735072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.735018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8cwj\" (UniqueName: \"kubernetes.io/projected/42031ed4-962e-4310-b28d-4e04504596d2-kube-api-access-n8cwj\") pod \"aws-ebs-csi-driver-node-vqjc5\" (UID: \"42031ed4-962e-4310-b28d-4e04504596d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5"
Apr 24 21:16:24.735072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.735020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvzd\" (UniqueName: \"kubernetes.io/projected/8d2fcb39-1f56-4917-9964-9a549ea0b2a2-kube-api-access-rjvzd\") pod \"iptables-alerter-s2wzz\" (UID: \"8d2fcb39-1f56-4917-9964-9a549ea0b2a2\") " pod="openshift-network-operator/iptables-alerter-s2wzz"
Apr 24 21:16:24.735279 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.735249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkd58\" (UniqueName: \"kubernetes.io/projected/f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1-kube-api-access-hkd58\") pod \"multus-additional-cni-plugins-cjzlc\" (UID: \"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1\") " pod="openshift-multus/multus-additional-cni-plugins-cjzlc"
Apr 24 21:16:24.735366 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.735283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkpq2\" (UniqueName: \"kubernetes.io/projected/b5070b1b-b70e-45ae-b891-b47bcfb3f22a-kube-api-access-rkpq2\") pod \"multus-klqg5\" (UID: \"b5070b1b-b70e-45ae-b891-b47bcfb3f22a\") " pod="openshift-multus/multus-klqg5"
Apr 24 21:16:24.736619 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.736588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj7nd\" (UniqueName: \"kubernetes.io/projected/372d4cd0-c127-43df-b1c7-06d67c0f967c-kube-api-access-xj7nd\") pod \"tuned-6xzp7\" (UID: \"372d4cd0-c127-43df-b1c7-06d67c0f967c\") " pod="openshift-cluster-node-tuning-operator/tuned-6xzp7"
Apr 24 21:16:24.737125 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.737108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgzb\" (UniqueName: \"kubernetes.io/projected/a2a58d1a-b069-41cc-b869-2a502d2e4e3c-kube-api-access-szgzb\") pod \"node-ca-2tx46\" (UID: \"a2a58d1a-b069-41cc-b869-2a502d2e4e3c\") " pod="openshift-image-registry/node-ca-2tx46"
Apr 24 21:16:24.824864 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-systemd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.824864 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-systemd-units\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-etc-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-node-log\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-systemd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-systemd-units\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-etc-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824987 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.824993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-node-log\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-ovn\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-bin\") pod \"ovnkube-node-cg9z2\" (UID:
\"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-netns\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-run-ovn\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825093 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-netd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-bin\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xff2q\" (UniqueName: \"kubernetes.io/projected/38d01fc4-4ff2-408e-baa1-6d9c62d27470-kube-api-access-xff2q\") pod \"ovnkube-node-cg9z2\" (UID: 
\"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-script-lib\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-config\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-cni-netd\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-kubelet\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-kubelet\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-netns\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-log-socket\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-log-socket\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: 
\"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-run-ovn-kubernetes\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-slash\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-slash\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:24.825397 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:24.825696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-var-lib-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: E0424 
21:16:24.825475 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:25.325455366 +0000 UTC m=+3.176643869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-var-lib-openvswitch\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-env-overrides\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl92b\" (UniqueName: \"kubernetes.io/projected/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-kube-api-access-rl92b\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825558 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovn-node-metrics-cert\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38d01fc4-4ff2-408e-baa1-6d9c62d27470-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-script-lib\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.825812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovnkube-config\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.826505 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.826036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38d01fc4-4ff2-408e-baa1-6d9c62d27470-env-overrides\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.827986 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.827957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38d01fc4-4ff2-408e-baa1-6d9c62d27470-ovn-node-metrics-cert\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.839321 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.839256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl92b\" (UniqueName: \"kubernetes.io/projected/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-kube-api-access-rl92b\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:24.839321 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.839303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xff2q\" (UniqueName: \"kubernetes.io/projected/38d01fc4-4ff2-408e-baa1-6d9c62d27470-kube-api-access-xff2q\") pod \"ovnkube-node-cg9z2\" (UID: \"38d01fc4-4ff2-408e-baa1-6d9c62d27470\") " pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:24.915352 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.915319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4x96t" Apr 24 21:16:24.923142 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.923116 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2tx46" Apr 24 21:16:24.931733 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.931709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" Apr 24 21:16:24.937297 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.937273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:24.943827 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.943801 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2wzz" Apr 24 21:16:24.949406 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.949383 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" Apr 24 21:16:24.955075 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.955054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" Apr 24 21:16:24.960201 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.960179 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-klqg5" Apr 24 21:16:24.965822 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:24.965803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:25.210079 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.210053 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a58d1a_b069_41cc_b869_2a502d2e4e3c.slice/crio-8a623ea1cc0ce3997725578bc42108aeb1d06de3bd5c8e2a074e2042055ccb76 WatchSource:0}: Error finding container 8a623ea1cc0ce3997725578bc42108aeb1d06de3bd5c8e2a074e2042055ccb76: Status 404 returned error can't find the container with id 8a623ea1cc0ce3997725578bc42108aeb1d06de3bd5c8e2a074e2042055ccb76 Apr 24 21:16:25.210954 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.210913 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod372d4cd0_c127_43df_b1c7_06d67c0f967c.slice/crio-b9f01f5749eaa197c2f590e44be7a2c8ef75c2a4c3d1765fd5f871b65c44e1f0 WatchSource:0}: Error finding container b9f01f5749eaa197c2f590e44be7a2c8ef75c2a4c3d1765fd5f871b65c44e1f0: Status 404 returned error can't find the container with id b9f01f5749eaa197c2f590e44be7a2c8ef75c2a4c3d1765fd5f871b65c44e1f0 Apr 24 21:16:25.211876 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.211853 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b3582a_d64c_4ef1_8758_602aabd2be60.slice/crio-0c11279b8161d4d5fa6ba131111dae7b9168180eb15a0f3fadab7d8b1d963f17 WatchSource:0}: Error finding container 0c11279b8161d4d5fa6ba131111dae7b9168180eb15a0f3fadab7d8b1d963f17: Status 404 returned error can't find the container with id 0c11279b8161d4d5fa6ba131111dae7b9168180eb15a0f3fadab7d8b1d963f17 Apr 24 21:16:25.212835 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.212654 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d2fcb39_1f56_4917_9964_9a549ea0b2a2.slice/crio-32728c00af6bc4b51bf5013e7595a8dea516f4fa7c683b34c67022bdf61cb1b8 WatchSource:0}: Error finding container 32728c00af6bc4b51bf5013e7595a8dea516f4fa7c683b34c67022bdf61cb1b8: Status 404 returned error can't find the container with id 32728c00af6bc4b51bf5013e7595a8dea516f4fa7c683b34c67022bdf61cb1b8 Apr 24 21:16:25.213788 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.213768 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d01fc4_4ff2_408e_baa1_6d9c62d27470.slice/crio-44c2d11e6c1e058bdf66ccf1f6328cae9a055dad3575a184e512ffd4be2fb457 WatchSource:0}: Error finding container 44c2d11e6c1e058bdf66ccf1f6328cae9a055dad3575a184e512ffd4be2fb457: Status 404 returned error can't find the container with id 44c2d11e6c1e058bdf66ccf1f6328cae9a055dad3575a184e512ffd4be2fb457 Apr 24 21:16:25.216391 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.216366 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90d25e6_8bbe_484f_9222_fe772ed03d48.slice/crio-79da6f98b76872647cdcccead12dedce6630e0b71e280c131564aee54a8af222 WatchSource:0}: Error finding container 79da6f98b76872647cdcccead12dedce6630e0b71e280c131564aee54a8af222: Status 404 returned error can't find the container with id 79da6f98b76872647cdcccead12dedce6630e0b71e280c131564aee54a8af222 Apr 24 21:16:25.218163 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.217849 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fda77c_3dd1_4cdf_998b_c5cf0dac12b1.slice/crio-7d9f07dc93c87455269a1141b804fbefc72b595bee9572ffbb3fc5a6a66bcf37 WatchSource:0}: Error finding container 7d9f07dc93c87455269a1141b804fbefc72b595bee9572ffbb3fc5a6a66bcf37: Status 404 returned error can't find the 
container with id 7d9f07dc93c87455269a1141b804fbefc72b595bee9572ffbb3fc5a6a66bcf37 Apr 24 21:16:25.218721 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:16:25.218627 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42031ed4_962e_4310_b28d_4e04504596d2.slice/crio-22400cc4def3d441b54aa1a51fe9eb349958b49e64811ac9e83e43aa3b14a55a WatchSource:0}: Error finding container 22400cc4def3d441b54aa1a51fe9eb349958b49e64811ac9e83e43aa3b14a55a: Status 404 returned error can't find the container with id 22400cc4def3d441b54aa1a51fe9eb349958b49e64811ac9e83e43aa3b14a55a Apr 24 21:16:25.329174 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.328986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.329189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329137 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329253 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329277 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329281 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329331 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:26.32931816 +0000 UTC m=+4.180506637 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:25.329385 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.329344 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:26.329337573 +0000 UTC m=+4.180526050 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:25.646881 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.646840 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:23 +0000 UTC" deadline="2028-01-02 20:32:57.038769818 +0000 UTC" Apr 24 21:16:25.646881 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.646879 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14831h16m31.391894832s" Apr 24 21:16:25.708492 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.708460 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:25.708672 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:25.708605 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:25.727688 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.727618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" event={"ID":"42031ed4-962e-4310-b28d-4e04504596d2","Type":"ContainerStarted","Data":"22400cc4def3d441b54aa1a51fe9eb349958b49e64811ac9e83e43aa3b14a55a"} Apr 24 21:16:25.736397 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.736362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4x96t" event={"ID":"b90d25e6-8bbe-484f-9222-fe772ed03d48","Type":"ContainerStarted","Data":"79da6f98b76872647cdcccead12dedce6630e0b71e280c131564aee54a8af222"} Apr 24 21:16:25.756198 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.756149 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2wzz" event={"ID":"8d2fcb39-1f56-4917-9964-9a549ea0b2a2","Type":"ContainerStarted","Data":"32728c00af6bc4b51bf5013e7595a8dea516f4fa7c683b34c67022bdf61cb1b8"} Apr 24 21:16:25.768576 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.768521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2c2cp" event={"ID":"c3b3582a-d64c-4ef1-8758-602aabd2be60","Type":"ContainerStarted","Data":"0c11279b8161d4d5fa6ba131111dae7b9168180eb15a0f3fadab7d8b1d963f17"} Apr 24 21:16:25.789921 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.789849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2tx46" event={"ID":"a2a58d1a-b069-41cc-b869-2a502d2e4e3c","Type":"ContainerStarted","Data":"8a623ea1cc0ce3997725578bc42108aeb1d06de3bd5c8e2a074e2042055ccb76"} Apr 24 21:16:25.799696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.799648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" 
event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerStarted","Data":"7d9f07dc93c87455269a1141b804fbefc72b595bee9572ffbb3fc5a6a66bcf37"}
Apr 24 21:16:25.805392 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.805313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klqg5" event={"ID":"b5070b1b-b70e-45ae-b891-b47bcfb3f22a","Type":"ContainerStarted","Data":"bb4d42a67b51ba89c42076c04236ef3a903560b389adb1e69f0de5d53e42c076"}
Apr 24 21:16:25.808147 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.808099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"44c2d11e6c1e058bdf66ccf1f6328cae9a055dad3575a184e512ffd4be2fb457"}
Apr 24 21:16:25.815399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.815326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" event={"ID":"372d4cd0-c127-43df-b1c7-06d67c0f967c","Type":"ContainerStarted","Data":"b9f01f5749eaa197c2f590e44be7a2c8ef75c2a4c3d1765fd5f871b65c44e1f0"}
Apr 24 21:16:25.830128 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:25.830093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal" event={"ID":"91804ee55f307105f50a9b09138ac4ec","Type":"ContainerStarted","Data":"93cbb2865f98f2d7f319a458e3e380258d6781fe329d212420f0a422e83e4e62"}
Apr 24 21:16:26.337308 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.337265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:26.337484 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.337359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:26.337559 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.337526 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:26.337559 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.337547 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:26.337658 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.337559 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:26.337658 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.337623 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.337601995 +0000 UTC m=+6.188790478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:26.338093 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.338063 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:26.338207 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.338127 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.338109191 +0000 UTC m=+6.189297671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:26.710801 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.709144 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:26.710801 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:26.709276 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:26.861716 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.861653 2578 generic.go:358] "Generic (PLEG): container finished" podID="f22afa25508444e921602cc1e4e9b57f" containerID="e8e1d54f234402071d87e967ff008f5f14e63b75dfcb2a67f3a481b8daf47ea2" exitCode=0
Apr 24 21:16:26.862631 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.861925 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal" event={"ID":"f22afa25508444e921602cc1e4e9b57f","Type":"ContainerDied","Data":"e8e1d54f234402071d87e967ff008f5f14e63b75dfcb2a67f3a481b8daf47ea2"}
Apr 24 21:16:26.876666 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:26.875817 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-81.ec2.internal" podStartSLOduration=3.875796513 podStartE2EDuration="3.875796513s" podCreationTimestamp="2026-04-24 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:25.844512668 +0000 UTC m=+3.695701161" watchObservedRunningTime="2026-04-24 21:16:26.875796513 +0000 UTC m=+4.726985014"
Apr 24 21:16:27.708013 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:27.707946 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:27.708229 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:27.708100 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:27.871481 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:27.871444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal" event={"ID":"f22afa25508444e921602cc1e4e9b57f","Type":"ContainerStarted","Data":"74c64b2e5211d6f429202740796f696552b413d97c46e8e432765c9920aef7fc"}
Apr 24 21:16:28.355275 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:28.355231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:28.355453 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:28.355310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:28.355453 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355423 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:28.355453 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355438 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:28.355453 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355449 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:28.355752 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355463 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:28.355752 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355510 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:32.355488684 +0000 UTC m=+10.206677176 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:28.355752 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.355530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:32.355519607 +0000 UTC m=+10.206708086 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:28.708788 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:28.708696 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:28.708939 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:28.708857 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:29.708278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:29.708245 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:29.708744 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:29.708387 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:30.708752 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:30.708716 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:30.709203 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:30.708850 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:31.707965 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:31.707930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:31.708173 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:31.708085 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:32.389510 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:32.389295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:32.389510 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:32.389365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:32.389510 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389486 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:32.389510 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389512 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:32.389510 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389525 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:32.390328 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:40.389563365 +0000 UTC m=+18.240751856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:32.390328 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389488 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:32.390328 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.389805 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:40.38978456 +0000 UTC m=+18.240973054 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:32.709346 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:32.708893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:32.709346 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:32.708982 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:33.708006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:33.707961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:33.708470 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:33.708120 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:34.708714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:34.708617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:34.709175 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:34.708758 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:35.708215 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:35.708173 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:35.708414 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:35.708322 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:35.985367 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:35.985316 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-81.ec2.internal" podStartSLOduration=12.985300327000001 podStartE2EDuration="12.985300327s" podCreationTimestamp="2026-04-24 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:27.885549447 +0000 UTC m=+5.736737947" watchObservedRunningTime="2026-04-24 21:16:35.985300327 +0000 UTC m=+13.836488825"
Apr 24 21:16:35.985840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:35.985820 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6lx29"]
Apr 24 21:16:36.003311 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.003278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.003450 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.003354 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:36.116799 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.116752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-dbus\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.116971 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.116838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.116971 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.116897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-kubelet-config\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.218020 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.217975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.218181 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.218051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-kubelet-config\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.218181 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.218080 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-dbus\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.218181 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.218122 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:36.218331 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.218189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-kubelet-config\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.218331 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.218206 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:36.718186568 +0000 UTC m=+14.569375047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:36.218331 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.218251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcc464bb-ad40-4751-9ee3-8ee223123ed6-dbus\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.708011 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.707972 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:36.708204 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.708106 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:36.721823 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:36.721772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:36.722015 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.721929 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:36.722015 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:36.721994 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.721978747 +0000 UTC m=+15.573167229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:37.707925 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:37.707891 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:37.708295 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:37.707891 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:37.708295 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:37.708041 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:37.708295 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:37.708119 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:37.730877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:37.730838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:37.731037 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:37.731016 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:37.731104 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:37.731094 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:39.731074939 +0000 UTC m=+17.582263429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:38.708849 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:38.708809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:38.709285 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:38.708953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:39.708206 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:39.708169 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:39.708434 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:39.708169 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:39.708434 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:39.708296 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:39.708434 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:39.708397 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:39.745903 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:39.745866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:39.746315 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:39.746006 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:39.746315 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:39.746095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:43.746073727 +0000 UTC m=+21.597262206 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:40.451380 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:40.451340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:40.451556 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:40.451409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:40.451556 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451527 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:40.451556 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451552 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:40.451728 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451566 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:40.451728 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451524 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:40.451728 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.451600539 +0000 UTC m=+34.302789016 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:40.451728 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.451702 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.451662051 +0000 UTC m=+34.302850542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:40.708037 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:40.707953 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:40.708214 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:40.708070 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:41.708301 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:41.708266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:41.708704 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:41.708393 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:41.708704 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:41.708454 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:41.708704 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:41.708558 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6" Apr 24 21:16:42.708761 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.708726 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:42.709209 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:42.708857 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:42.905707 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.905375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"90216f34fd5ef6883f4afac266ecd02a0bd2513e02b3c45da66d290a9a977a50"} Apr 24 21:16:42.907225 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.907186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" event={"ID":"372d4cd0-c127-43df-b1c7-06d67c0f967c","Type":"ContainerStarted","Data":"77944b38740c93861d16d4412bb89451cc8e7b877626c7c3e1f3c2e344bf3a54"} Apr 24 21:16:42.911177 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.911152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2c2cp" event={"ID":"c3b3582a-d64c-4ef1-8758-602aabd2be60","Type":"ContainerStarted","Data":"a9ed0e0e0313ca4944022d9d6eb675ba110c200e718459c16b37fb5c02112bac"} Apr 24 21:16:42.914878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.914805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-klqg5" 
event={"ID":"b5070b1b-b70e-45ae-b891-b47bcfb3f22a","Type":"ContainerStarted","Data":"e076603016b08fa345dc53057936e6a8efd38b4cca51bfaa71108542e3633729"} Apr 24 21:16:42.930011 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.929943 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6xzp7" podStartSLOduration=3.507718804 podStartE2EDuration="20.929922872s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.213030049 +0000 UTC m=+3.064218533" lastFinishedPulling="2026-04-24 21:16:42.63523411 +0000 UTC m=+20.486422601" observedRunningTime="2026-04-24 21:16:42.929199416 +0000 UTC m=+20.780387916" watchObservedRunningTime="2026-04-24 21:16:42.929922872 +0000 UTC m=+20.781111371" Apr 24 21:16:42.951348 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.950983 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2c2cp" podStartSLOduration=8.244839297 podStartE2EDuration="20.950964753s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.215014664 +0000 UTC m=+3.066203152" lastFinishedPulling="2026-04-24 21:16:37.921140095 +0000 UTC m=+15.772328608" observedRunningTime="2026-04-24 21:16:42.950444256 +0000 UTC m=+20.801632757" watchObservedRunningTime="2026-04-24 21:16:42.950964753 +0000 UTC m=+20.802153253" Apr 24 21:16:42.976568 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:42.976459 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-klqg5" podStartSLOduration=3.421741775 podStartE2EDuration="20.976439844s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.223931416 +0000 UTC m=+3.075119893" lastFinishedPulling="2026-04-24 21:16:42.77862947 +0000 UTC m=+20.629817962" observedRunningTime="2026-04-24 21:16:42.976159617 +0000 UTC m=+20.827348097" 
watchObservedRunningTime="2026-04-24 21:16:42.976439844 +0000 UTC m=+20.827628344" Apr 24 21:16:43.708873 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.708669 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:43.709711 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.708733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:43.709711 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:43.708946 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6" Apr 24 21:16:43.709711 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:43.709037 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:43.775163 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.775122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:43.775323 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:43.775288 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:43.775398 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:43.775373 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.77535301 +0000 UTC m=+29.626541491 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:43.864290 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.864269 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:16:43.918143 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.918109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" event={"ID":"42031ed4-962e-4310-b28d-4e04504596d2","Type":"ContainerStarted","Data":"ed36d6802f480cd2b0ba47d089ed23a9dedb03783b9adadfa800056ef86b3836"} Apr 24 21:16:43.918143 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.918147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" event={"ID":"42031ed4-962e-4310-b28d-4e04504596d2","Type":"ContainerStarted","Data":"79a21e1178fdf3e05c589ec9fe12dc338fb1aa5b5bb46c288a76433ea136eb4e"} Apr 24 21:16:43.919391 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.919362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4x96t" event={"ID":"b90d25e6-8bbe-484f-9222-fe772ed03d48","Type":"ContainerStarted","Data":"6a626c38e221e435f06ad16d5fd10ae3f188befd5986979453f0d0940476d827"} Apr 24 21:16:43.920598 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.920574 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2tx46" event={"ID":"a2a58d1a-b069-41cc-b869-2a502d2e4e3c","Type":"ContainerStarted","Data":"21aaf9883221b78cdbc0514c7f88a21f6159854799f4db2f3a53623673e7180f"} Apr 24 21:16:43.921846 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.921823 
2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="ba1606e8f205ba51d2816b92459a9f6fe323ddc13deff62a04f97e687abd6790" exitCode=0 Apr 24 21:16:43.921957 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.921889 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"ba1606e8f205ba51d2816b92459a9f6fe323ddc13deff62a04f97e687abd6790"} Apr 24 21:16:43.924396 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924379 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:16:43.924698 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924661 2578 generic.go:358] "Generic (PLEG): container finished" podID="38d01fc4-4ff2-408e-baa1-6d9c62d27470" containerID="be2807d480bcd583b4ea948bd068252fe39649eabe3d703569526c72d6dbc437" exitCode=1 Apr 24 21:16:43.924777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"491e7f844ed1f0cf27a0ccc927decf5d7ed9fa4d69c7e3cc14da1cf399f0bcb3"} Apr 24 21:16:43.924777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"62c65db17ff378e9126c7d95165f7db575338f4e5caee6d16bc2a236b8cee8d7"} Apr 24 21:16:43.924777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" 
event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"4a805cfd2abdb07a8d5a9fee695adbc7f5028daae8790f0c06b5c33b36760708"} Apr 24 21:16:43.924777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"3b90ceec3997e86fd28c12f00f6938835cd38145e212018f3f08e2d8799592f0"} Apr 24 21:16:43.924777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.924777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerDied","Data":"be2807d480bcd583b4ea948bd068252fe39649eabe3d703569526c72d6dbc437"} Apr 24 21:16:43.937530 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.937486 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4x96t" podStartSLOduration=4.520497046 podStartE2EDuration="21.937472326s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.218270138 +0000 UTC m=+3.069458631" lastFinishedPulling="2026-04-24 21:16:42.63524542 +0000 UTC m=+20.486433911" observedRunningTime="2026-04-24 21:16:43.9371563 +0000 UTC m=+21.788344800" watchObservedRunningTime="2026-04-24 21:16:43.937472326 +0000 UTC m=+21.788660825" Apr 24 21:16:43.973613 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:43.973561 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2tx46" podStartSLOduration=4.593037286 podStartE2EDuration="21.973547829s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.212333619 +0000 UTC m=+3.063522111" lastFinishedPulling="2026-04-24 21:16:42.592844173 +0000 UTC m=+20.444032654" observedRunningTime="2026-04-24 21:16:43.951043232 +0000 UTC 
m=+21.802231742" watchObservedRunningTime="2026-04-24 21:16:43.973547829 +0000 UTC m=+21.824736329" Apr 24 21:16:44.685283 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.685183 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:43.864285843Z","UUID":"21cc9eea-7c2b-4f0c-b6f1-3ab027f9dd6e","Handler":null,"Name":"","Endpoint":""} Apr 24 21:16:44.686887 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.686864 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:16:44.687010 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.686896 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:16:44.708166 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.708133 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:44.708315 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:44.708249 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:44.929075 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.929041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" event={"ID":"42031ed4-962e-4310-b28d-4e04504596d2","Type":"ContainerStarted","Data":"bcfe6743b2a9697f2aef6059859ea5d7fb0c2eea1ff17c04f6a2b06ad23e0caa"} Apr 24 21:16:44.930542 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.930511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2wzz" event={"ID":"8d2fcb39-1f56-4917-9964-9a549ea0b2a2","Type":"ContainerStarted","Data":"45e2e72baac4924f36ec71e2aff93caa6b3b5d2f06fcb5594c5e715b26856eb9"} Apr 24 21:16:44.947965 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.947862 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqjc5" podStartSLOduration=3.432324232 podStartE2EDuration="22.947844514s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.22230068 +0000 UTC m=+3.073489162" lastFinishedPulling="2026-04-24 21:16:44.737820952 +0000 UTC m=+22.589009444" observedRunningTime="2026-04-24 21:16:44.947641846 +0000 UTC m=+22.798830378" watchObservedRunningTime="2026-04-24 21:16:44.947844514 +0000 UTC m=+22.799033015" Apr 24 21:16:44.965474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:44.965422 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s2wzz" podStartSLOduration=5.543240629 podStartE2EDuration="22.965404874s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.215306717 +0000 UTC m=+3.066495209" lastFinishedPulling="2026-04-24 21:16:42.637470963 +0000 UTC m=+20.488659454" observedRunningTime="2026-04-24 21:16:44.965372205 
+0000 UTC m=+22.816560723" watchObservedRunningTime="2026-04-24 21:16:44.965404874 +0000 UTC m=+22.816593393" Apr 24 21:16:45.623223 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.623137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:45.623873 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.623848 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:45.707898 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.707847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:45.708113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.707847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:45.708113 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:45.707949 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6" Apr 24 21:16:45.708113 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:45.708058 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:45.935568 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.935489 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:16:45.936025 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.935996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"90e10e0140aa453056a30a8a571e73b263870e13ff60ab705fcab02ec4a2d192"} Apr 24 21:16:45.936371 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.936335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:45.936731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:45.936670 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2c2cp" Apr 24 21:16:46.708694 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:46.708476 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:46.708909 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:46.708785 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:47.708785 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.708701 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:47.709246 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.708701 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:47.709246 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:47.708828 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b" Apr 24 21:16:47.709246 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:47.708878 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6" Apr 24 21:16:47.943771 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.943610 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:16:47.944347 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.944227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"83a24f936ab7d90015fbd554a5f2222fa43ec61239e5964bbee4245418c6073a"} Apr 24 21:16:47.944714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.944510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:47.944714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.944535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:47.944714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.944669 2578 scope.go:117] "RemoveContainer" containerID="be2807d480bcd583b4ea948bd068252fe39649eabe3d703569526c72d6dbc437" Apr 24 21:16:47.959618 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:47.959294 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 21:16:48.708115 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.708075 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:48.708295 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:48.708213 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361" Apr 24 21:16:48.947310 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.947270 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="b73553b5e5612443e08b3354e2914fe2f6cf4c06f657a2f251f156a5e6f9770c" exitCode=0 Apr 24 21:16:48.947986 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.947337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"b73553b5e5612443e08b3354e2914fe2f6cf4c06f657a2f251f156a5e6f9770c"} Apr 24 21:16:48.953774 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.953756 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:16:48.954084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.954066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" event={"ID":"38d01fc4-4ff2-408e-baa1-6d9c62d27470","Type":"ContainerStarted","Data":"67995659ae30a4b4e05714c91a3c55b35587244828262545554860f46fab3629"} Apr 24 21:16:48.954344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.954328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" Apr 24 
21:16:48.968500 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.968431 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:16:48.999521 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:48.999481 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2" podStartSLOduration=9.540832035 podStartE2EDuration="26.999468023s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.216195498 +0000 UTC m=+3.067383979" lastFinishedPulling="2026-04-24 21:16:42.674831475 +0000 UTC m=+20.526019967" observedRunningTime="2026-04-24 21:16:48.998058076 +0000 UTC m=+26.849246586" watchObservedRunningTime="2026-04-24 21:16:48.999468023 +0000 UTC m=+26.850656518"
Apr 24 21:16:49.708117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.708081 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:49.708117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.708116 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:49.708312 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:49.708206 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:49.708353 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:49.708306 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:49.786697 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.786609 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6lx29"]
Apr 24 21:16:49.790318 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.790277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62n84"]
Apr 24 21:16:49.791013 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.790992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dwmtg"]
Apr 24 21:16:49.791152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.791138 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:49.791273 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:49.791249 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:49.958170 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.958138 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="d1930ba21dd3f24759215ba92057a19705a220b4f986c78121423ebae9b919b3" exitCode=0
Apr 24 21:16:49.958558 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.958235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"d1930ba21dd3f24759215ba92057a19705a220b4f986c78121423ebae9b919b3"}
Apr 24 21:16:49.958558 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.958276 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:49.958558 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:49.958396 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:49.958558 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:49.958473 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:49.958558 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:49.958542 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:50.962481 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:50.962393 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="fe3d040ff8d117faae751ce0b95b7178d83f7c8cac280f66a02201840ee056c3" exitCode=0
Apr 24 21:16:50.962862 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:50.962483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"fe3d040ff8d117faae751ce0b95b7178d83f7c8cac280f66a02201840ee056c3"}
Apr 24 21:16:51.707902 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:51.707775 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:51.707902 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:51.707828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:51.708117 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:51.707908 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:51.708117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:51.707977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:51.708117 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:51.708068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:51.708286 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:51.708171 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:51.838258 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:51.838174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:51.838407 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:51.838341 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:51.838456 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:51.838423 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret podName:dcc464bb-ad40-4751-9ee3-8ee223123ed6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:07.838401018 +0000 UTC m=+45.689589507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret") pod "global-pull-secret-syncer-6lx29" (UID: "dcc464bb-ad40-4751-9ee3-8ee223123ed6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:53.707829 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:53.707797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:53.708237 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:53.707797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:53.708237 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:53.707918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:53.708237 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:53.708018 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:53.708237 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:53.707798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:53.708237 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:53.708148 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:55.708668 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.708577 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:16:55.708668 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.708629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:16:55.709320 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:55.708695 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dwmtg" podUID="87f55c8d-8e86-4982-94b5-0f6145c23361"
Apr 24 21:16:55.709320 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:55.708757 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6lx29" podUID="dcc464bb-ad40-4751-9ee3-8ee223123ed6"
Apr 24 21:16:55.709320 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.708812 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84"
Apr 24 21:16:55.709320 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:55.708905 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62n84" podUID="214d9db8-1af2-4a55-8c32-7b0ade9b8b1b"
Apr 24 21:16:55.930296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.930264 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-81.ec2.internal" event="NodeReady"
Apr 24 21:16:55.930498 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.930427 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:16:55.989476 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:55.989434 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"]
Apr 24 21:16:56.018853 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.018805 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-srdps"]
Apr 24 21:16:56.019016 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.018966 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.030465 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.030430 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:16:56.031878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.031853 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:16:56.032078 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.032061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:16:56.032167 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.032118 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jgvbr\""
Apr 24 21:16:56.034612 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.034588 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tsgdk"]
Apr 24 21:16:56.034746 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.034712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:16:56.038718 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.038315 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:16:56.038718 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.038330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:16:56.038718 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.038338 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:16:56.038718 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.038435 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-znhx4\""
Apr 24 21:16:56.041324 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.041304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:16:56.048863 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.048838 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"]
Apr 24 21:16:56.048997 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.048868 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tsgdk"]
Apr 24 21:16:56.048997 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.048882 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-srdps"]
Apr 24 21:16:56.049092 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.049025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.051726 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.051704 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:16:56.051873 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.051759 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:16:56.051942 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.051881 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-86llq\""
Apr 24 21:16:56.170488 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.170724 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.170724 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.170724 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2nj4\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvsx\" (UniqueName: \"kubernetes.io/projected/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-kube-api-access-xbvsx\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmlb\" (UniqueName: \"kubernetes.io/projected/e0d12939-1428-44ac-bf2e-c0fee7bf5161-kube-api-access-2kmlb\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-config-volume\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.170883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-tmp-dir\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.171152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.171152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.171152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.170997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.171152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.171051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.271795 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.271795 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.271795 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2nj4\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvsx\" (UniqueName: \"kubernetes.io/projected/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-kube-api-access-xbvsx\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.271870 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.271891 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.271942 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.271961 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.771939804 +0000 UTC m=+34.623128286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found
Apr 24 21:16:56.272066 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.271989 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.771974082 +0000 UTC m=+34.623162559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.271872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmlb\" (UniqueName: \"kubernetes.io/projected/e0d12939-1428-44ac-bf2e-c0fee7bf5161-kube-api-access-2kmlb\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-config-volume\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-tmp-dir\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.272437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272879 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.272879 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.272745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-config-volume\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.272879 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.272769 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:56.272879 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.272833 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls podName:1b1a23d3-3a62-4a84-a3aa-d49e382f7322 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.772815452 +0000 UTC m=+34.624003934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls") pod "dns-default-tsgdk" (UID: "1b1a23d3-3a62-4a84-a3aa-d49e382f7322") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:56.273262 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.273232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-tmp-dir\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.273401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.273375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.274342 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.274272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.277863 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.277812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.277863 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.277813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.290124 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.290100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2nj4\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.290241 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.290209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:16:56.290934 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.290915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvsx\" (UniqueName: \"kubernetes.io/projected/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-kube-api-access-xbvsx\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:16:56.291326 ip-10-0-132-81 kubenswrapper[2578]:
I0424 21:16:56.291293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmlb\" (UniqueName: \"kubernetes.io/projected/e0d12939-1428-44ac-bf2e-c0fee7bf5161-kube-api-access-2kmlb\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:16:56.474439 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.474380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:56.474632 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.474532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:56.474632 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474552 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:56.474632 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474585 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:56.474632 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474598 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8p6kc for pod openshift-network-diagnostics/network-check-target-dwmtg: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:56.474814 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474645 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:56.474814 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474668 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc podName:87f55c8d-8e86-4982-94b5-0f6145c23361 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:28.474650187 +0000 UTC m=+66.325838667 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8p6kc" (UniqueName: "kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc") pod "network-check-target-dwmtg" (UID: "87f55c8d-8e86-4982-94b5-0f6145c23361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:56.474814 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.474774 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs podName:214d9db8-1af2-4a55-8c32-7b0ade9b8b1b nodeName:}" failed. No retries permitted until 2026-04-24 21:17:28.474757481 +0000 UTC m=+66.325945958 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs") pod "network-metrics-daemon-62n84" (UID: "214d9db8-1af2-4a55-8c32-7b0ade9b8b1b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:56.777235 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.777200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777373 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777396 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777441 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777457 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.777436397 +0000 UTC m=+35.628624885 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777483 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.777471906 +0000 UTC m=+35.628660382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.777373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:56.777533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk" Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777612 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:56.777954 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:56.777637 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls podName:1b1a23d3-3a62-4a84-a3aa-d49e382f7322 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.777630545 +0000 UTC m=+35.628819022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls") pod "dns-default-tsgdk" (UID: "1b1a23d3-3a62-4a84-a3aa-d49e382f7322") : secret "dns-default-metrics-tls" not found Apr 24 21:16:57.708520 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.708485 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:16:57.708739 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.708537 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29" Apr 24 21:16:57.708739 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.708546 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:16:57.711294 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.711267 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:57.712067 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.712041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:57.712192 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.712076 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jphgp\"" Apr 24 21:16:57.712192 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.712091 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w24qg\"" Apr 24 21:16:57.712192 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.712093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:57.712192 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.712177 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:57.787051 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.787017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk" Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.787083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.787148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787181 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787235 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787251 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls podName:1b1a23d3-3a62-4a84-a3aa-d49e382f7322 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.787234368 +0000 UTC m=+37.638422846 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls") pod "dns-default-tsgdk" (UID: "1b1a23d3-3a62-4a84-a3aa-d49e382f7322") : secret "dns-default-metrics-tls" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787268 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787279 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.787263713 +0000 UTC m=+37.638452194 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787285 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found Apr 24 21:16:57.787557 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:57.787332 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.787320039 +0000 UTC m=+37.638508516 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found Apr 24 21:16:57.982607 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.982577 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="8e41b6ea5277ec6dd7df7fab0608176370656badedba04d4065ed9e515c4b6b3" exitCode=0 Apr 24 21:16:57.982777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:57.982629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"8e41b6ea5277ec6dd7df7fab0608176370656badedba04d4065ed9e515c4b6b3"} Apr 24 21:16:58.987425 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:58.987391 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1" containerID="07548d291570a5bfd1e2341c87ca61830cc642b146f925a3108de776c74af4cc" exitCode=0 Apr 24 21:16:58.987905 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:58.987446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerDied","Data":"07548d291570a5bfd1e2341c87ca61830cc642b146f925a3108de776c74af4cc"} Apr 24 21:16:59.804400 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:59.804220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:16:59.804577 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:59.804426 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk" Apr 24 21:16:59.804577 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804374 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:59.804577 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:59.804480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:16:59.804577 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804522 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.804505682 +0000 UTC m=+41.655694159 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found Apr 24 21:16:59.804577 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804573 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:59.804861 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804586 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found Apr 24 21:16:59.804861 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804604 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:59.804861 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804632 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.804618358 +0000 UTC m=+41.655806841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found Apr 24 21:16:59.804861 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:16:59.804646 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls podName:1b1a23d3-3a62-4a84-a3aa-d49e382f7322 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.804639543 +0000 UTC m=+41.655828021 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls") pod "dns-default-tsgdk" (UID: "1b1a23d3-3a62-4a84-a3aa-d49e382f7322") : secret "dns-default-metrics-tls" not found Apr 24 21:16:59.991872 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:16:59.991829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" event={"ID":"f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1","Type":"ContainerStarted","Data":"d85ec63b06526f9a1c6e172818b989142c0f9bb41f390df773f37c3f768aa81b"} Apr 24 21:17:00.015446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:00.015398 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cjzlc" podStartSLOduration=6.417310932 podStartE2EDuration="38.015382274s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.221070344 +0000 UTC m=+3.072258827" lastFinishedPulling="2026-04-24 21:16:56.819141692 +0000 UTC m=+34.670330169" observedRunningTime="2026-04-24 21:17:00.014328694 +0000 UTC m=+37.865517192" watchObservedRunningTime="2026-04-24 21:17:00.015382274 +0000 UTC m=+37.866570771" Apr 24 21:17:01.612396 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.611642 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5"] Apr 24 21:17:01.639440 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.639409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5"] Apr 24 21:17:01.639582 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.639547 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" Apr 24 21:17:01.642101 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.642070 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:17:01.642904 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.642879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-r6tqh\"" Apr 24 21:17:01.643034 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.642879 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:17:01.820484 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.820448 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbh5k\" (UniqueName: \"kubernetes.io/projected/165cdc84-5c62-46ec-9b79-042587eb1d6d-kube-api-access-rbh5k\") pod \"migrator-74bb7799d9-cl5g5\" (UID: \"165cdc84-5c62-46ec-9b79-042587eb1d6d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" Apr 24 21:17:01.920934 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.920840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbh5k\" (UniqueName: \"kubernetes.io/projected/165cdc84-5c62-46ec-9b79-042587eb1d6d-kube-api-access-rbh5k\") pod \"migrator-74bb7799d9-cl5g5\" (UID: \"165cdc84-5c62-46ec-9b79-042587eb1d6d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" Apr 24 21:17:01.929780 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.929747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbh5k\" (UniqueName: 
\"kubernetes.io/projected/165cdc84-5c62-46ec-9b79-042587eb1d6d-kube-api-access-rbh5k\") pod \"migrator-74bb7799d9-cl5g5\" (UID: \"165cdc84-5c62-46ec-9b79-042587eb1d6d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" Apr 24 21:17:01.948669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:01.948642 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" Apr 24 21:17:02.120624 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:02.120590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5"] Apr 24 21:17:02.123975 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:02.123946 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165cdc84_5c62_46ec_9b79_042587eb1d6d.slice/crio-b9a7053508acfbad083a68c0eb270e53ce0bb7cb124614598320878140cae291 WatchSource:0}: Error finding container b9a7053508acfbad083a68c0eb270e53ce0bb7cb124614598320878140cae291: Status 404 returned error can't find the container with id b9a7053508acfbad083a68c0eb270e53ce0bb7cb124614598320878140cae291 Apr 24 21:17:02.443311 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:02.443285 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4x96t_b90d25e6-8bbe-484f-9222-fe772ed03d48/dns-node-resolver/0.log" Apr 24 21:17:02.999113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:02.999071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" event={"ID":"165cdc84-5c62-46ec-9b79-042587eb1d6d","Type":"ContainerStarted","Data":"b9a7053508acfbad083a68c0eb270e53ce0bb7cb124614598320878140cae291"} Apr 24 21:17:03.444494 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.444423 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-2tx46_a2a58d1a-b069-41cc-b869-2a502d2e4e3c/node-ca/0.log" Apr 24 21:17:03.836968 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.836931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.836985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.837016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk" Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837100 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837109 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837128 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:03.837137 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837132 2578 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found
Apr 24 21:17:03.837306 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837155 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls podName:1b1a23d3-3a62-4a84-a3aa-d49e382f7322 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.837139972 +0000 UTC m=+49.688328452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls") pod "dns-default-tsgdk" (UID: "1b1a23d3-3a62-4a84-a3aa-d49e382f7322") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:03.837306 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.8371656 +0000 UTC m=+49.688354079 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found
Apr 24 21:17:03.837306 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:03.837201 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.8371927 +0000 UTC m=+49.688381179 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found
Apr 24 21:17:03.843220 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.843197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-grdrz"]
Apr 24 21:17:03.870125 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.870098 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-grdrz"]
Apr 24 21:17:03.870278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.870211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:03.873064 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.873040 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 21:17:03.873197 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.873040 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 21:17:03.873900 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.873884 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 21:17:03.874000 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.873926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 21:17:03.874000 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:03.873975 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-sz295\""
Apr 24 21:17:04.002437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.002405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" event={"ID":"165cdc84-5c62-46ec-9b79-042587eb1d6d","Type":"ContainerStarted","Data":"e1b38802b9af1d973a3d255b35a7c96d90fdf82ede9e54b3b3657691d31db6f9"}
Apr 24 21:17:04.002437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.002441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" event={"ID":"165cdc84-5c62-46ec-9b79-042587eb1d6d","Type":"ContainerStarted","Data":"98e350fdd29c361d46c4159754e2370a33d7becad7abd11e22ecf8f5be93f4ff"}
Apr 24 21:17:04.018489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.018435 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cl5g5" podStartSLOduration=1.606683968 podStartE2EDuration="3.018419432s" podCreationTimestamp="2026-04-24 21:17:01 +0000 UTC" firstStartedPulling="2026-04-24 21:17:02.125870679 +0000 UTC m=+39.977059156" lastFinishedPulling="2026-04-24 21:17:03.537606142 +0000 UTC m=+41.388794620" observedRunningTime="2026-04-24 21:17:04.017408935 +0000 UTC m=+41.868597435" watchObservedRunningTime="2026-04-24 21:17:04.018419432 +0000 UTC m=+41.869607931"
Apr 24 21:17:04.039290 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.039259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac605d18-6710-477a-b057-44d9e53eba0c-signing-cabundle\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.039388 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.039327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac605d18-6710-477a-b057-44d9e53eba0c-signing-key\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.039437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.039385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrr96\" (UniqueName: \"kubernetes.io/projected/ac605d18-6710-477a-b057-44d9e53eba0c-kube-api-access-qrr96\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.139952 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.139916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac605d18-6710-477a-b057-44d9e53eba0c-signing-cabundle\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.140054 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.139988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac605d18-6710-477a-b057-44d9e53eba0c-signing-key\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.140054 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.140024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrr96\" (UniqueName: \"kubernetes.io/projected/ac605d18-6710-477a-b057-44d9e53eba0c-kube-api-access-qrr96\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.140922 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.140885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac605d18-6710-477a-b057-44d9e53eba0c-signing-cabundle\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.142372 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.142349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac605d18-6710-477a-b057-44d9e53eba0c-signing-key\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.148108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.148089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrr96\" (UniqueName: \"kubernetes.io/projected/ac605d18-6710-477a-b057-44d9e53eba0c-kube-api-access-qrr96\") pod \"service-ca-865cb79987-grdrz\" (UID: \"ac605d18-6710-477a-b057-44d9e53eba0c\") " pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.179507 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.179475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-grdrz"
Apr 24 21:17:04.291725 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:04.291613 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-grdrz"]
Apr 24 21:17:04.295048 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:04.295025 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac605d18_6710_477a_b057_44d9e53eba0c.slice/crio-448b277cd992ce7609e57946566b389bd59db4811f3ee00b4a7c694cf28fed9a WatchSource:0}: Error finding container 448b277cd992ce7609e57946566b389bd59db4811f3ee00b4a7c694cf28fed9a: Status 404 returned error can't find the container with id 448b277cd992ce7609e57946566b389bd59db4811f3ee00b4a7c694cf28fed9a
Apr 24 21:17:05.006369 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:05.006320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-grdrz" event={"ID":"ac605d18-6710-477a-b057-44d9e53eba0c","Type":"ContainerStarted","Data":"448b277cd992ce7609e57946566b389bd59db4811f3ee00b4a7c694cf28fed9a"}
Apr 24 21:17:07.012014 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:07.011970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-grdrz" event={"ID":"ac605d18-6710-477a-b057-44d9e53eba0c","Type":"ContainerStarted","Data":"a3abd377dffbdba860b0b73d13535290c7071878b2e4fe7c756d9460506bf26c"}
Apr 24 21:17:07.026145 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:07.026088 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-grdrz" podStartSLOduration=1.862265157 podStartE2EDuration="4.026074354s" podCreationTimestamp="2026-04-24 21:17:03 +0000 UTC" firstStartedPulling="2026-04-24 21:17:04.296842542 +0000 UTC m=+42.148031024" lastFinishedPulling="2026-04-24 21:17:06.460651744 +0000 UTC m=+44.311840221" observedRunningTime="2026-04-24 21:17:07.025206094 +0000 UTC m=+44.876394594" watchObservedRunningTime="2026-04-24 21:17:07.026074354 +0000 UTC m=+44.877262832"
Apr 24 21:17:07.871283 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:07.871221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:17:07.873875 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:07.873842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcc464bb-ad40-4751-9ee3-8ee223123ed6-original-pull-secret\") pod \"global-pull-secret-syncer-6lx29\" (UID: \"dcc464bb-ad40-4751-9ee3-8ee223123ed6\") " pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:17:07.918800 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:07.918763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6lx29"
Apr 24 21:17:08.047366 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:08.047321 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6lx29"]
Apr 24 21:17:08.053542 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:08.053340 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc464bb_ad40_4751_9ee3_8ee223123ed6.slice/crio-333706d621b7fc43856aa40de0363eb9fa080191b1cc13dad10008e6142dc7fa WatchSource:0}: Error finding container 333706d621b7fc43856aa40de0363eb9fa080191b1cc13dad10008e6142dc7fa: Status 404 returned error can't find the container with id 333706d621b7fc43856aa40de0363eb9fa080191b1cc13dad10008e6142dc7fa
Apr 24 21:17:09.019202 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:09.019083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6lx29" event={"ID":"dcc464bb-ad40-4751-9ee3-8ee223123ed6","Type":"ContainerStarted","Data":"333706d621b7fc43856aa40de0363eb9fa080191b1cc13dad10008e6142dc7fa"}
Apr 24 21:17:11.905090 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:11.905045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:11.905132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps"
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:11.905177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:11.905231 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:11.905257 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7bdc7564-g4mfd: secret "image-registry-tls" not found
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:11.905301 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:11.905326 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls podName:12da021b-8530-4058-b1e8-2689e6c9fdb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.905304268 +0000 UTC m=+65.756492755 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls") pod "image-registry-5c7bdc7564-g4mfd" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6") : secret "image-registry-tls" not found
Apr 24 21:17:11.905573 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:11.905361 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert podName:e0d12939-1428-44ac-bf2e-c0fee7bf5161 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.905343738 +0000 UTC m=+65.756532228 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert") pod "ingress-canary-srdps" (UID: "e0d12939-1428-44ac-bf2e-c0fee7bf5161") : secret "canary-serving-cert" not found
Apr 24 21:17:11.907901 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:11.907876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b1a23d3-3a62-4a84-a3aa-d49e382f7322-metrics-tls\") pod \"dns-default-tsgdk\" (UID: \"1b1a23d3-3a62-4a84-a3aa-d49e382f7322\") " pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:17:11.959410 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:11.959377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:17:12.269297 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:12.269265 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tsgdk"]
Apr 24 21:17:12.272632 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:12.272603 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1a23d3_3a62_4a84_a3aa_d49e382f7322.slice/crio-ff5adb87c184488ed3c3ea84a067c25a6a3825dd843b65dc51a8890038a5f36f WatchSource:0}: Error finding container ff5adb87c184488ed3c3ea84a067c25a6a3825dd843b65dc51a8890038a5f36f: Status 404 returned error can't find the container with id ff5adb87c184488ed3c3ea84a067c25a6a3825dd843b65dc51a8890038a5f36f
Apr 24 21:17:13.030048 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:13.030004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6lx29" event={"ID":"dcc464bb-ad40-4751-9ee3-8ee223123ed6","Type":"ContainerStarted","Data":"dcd58c9a6adccacd5b8070c4a1595a209d114f08a8c13d7087ab9af9860fc37b"}
Apr 24 21:17:13.031136 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:13.031099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tsgdk" event={"ID":"1b1a23d3-3a62-4a84-a3aa-d49e382f7322","Type":"ContainerStarted","Data":"ff5adb87c184488ed3c3ea84a067c25a6a3825dd843b65dc51a8890038a5f36f"}
Apr 24 21:17:13.045462 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:13.045402 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6lx29" podStartSLOduration=33.59052574 podStartE2EDuration="38.045382189s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:17:08.05551365 +0000 UTC m=+45.906702131" lastFinishedPulling="2026-04-24 21:17:12.510370103 +0000 UTC m=+50.361558580" observedRunningTime="2026-04-24 21:17:13.044559969 +0000 UTC m=+50.895748467" watchObservedRunningTime="2026-04-24 21:17:13.045382189 +0000 UTC m=+50.896570690"
Apr 24 21:17:14.035888 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:14.035785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tsgdk" event={"ID":"1b1a23d3-3a62-4a84-a3aa-d49e382f7322","Type":"ContainerStarted","Data":"b3ee095dbab3ef536b118fbd8d095048be609cebd9f297aff61e146a078df334"}
Apr 24 21:17:14.035888 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:14.035832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tsgdk" event={"ID":"1b1a23d3-3a62-4a84-a3aa-d49e382f7322","Type":"ContainerStarted","Data":"6355789750b158c39c1dc351f06a2932abbd21671cd57a5b3c464b27387c8a25"}
Apr 24 21:17:14.053937 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:14.053799 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tsgdk" podStartSLOduration=17.588733621 podStartE2EDuration="19.053780935s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="2026-04-24 21:17:12.274385246 +0000 UTC m=+50.125573727" lastFinishedPulling="2026-04-24 21:17:13.739432551 +0000 UTC m=+51.590621041" observedRunningTime="2026-04-24 21:17:14.053653525 +0000 UTC m=+51.904842025" watchObservedRunningTime="2026-04-24 21:17:14.053780935 +0000 UTC m=+51.904969448"
Apr 24 21:17:15.041204 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:15.039556 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:17:20.973290 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:20.973262 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cg9z2"
Apr 24 21:17:25.043618 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:25.043587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tsgdk"
Apr 24 21:17:26.731519 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.731489 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-m27jw"]
Apr 24 21:17:26.772918 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.772891 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"]
Apr 24 21:17:26.773066 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.773036 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:26.775793 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.775765 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:17:26.775930 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.775888 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4xlw8\""
Apr 24 21:17:26.776059 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.776039 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:17:26.785837 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.785817 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m27jw"]
Apr 24 21:17:26.785960 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.785841 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"]
Apr 24 21:17:26.785960 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.785943 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.789050 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789031 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:17:26.789285 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789269 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:17:26.789471 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:17:26.789756 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789739 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wpj8v\""
Apr 24 21:17:26.789840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789763 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:17:26.790002 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.789987 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:17:26.819670 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.819824 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.819824 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.819903 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.819903 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.819975 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgwzd\" (UniqueName: \"kubernetes.io/projected/f9ced7e0-e303-4e2f-9642-e2c46ac4800a-kube-api-access-lgwzd\") pod \"downloads-6bcc868b7-m27jw\" (UID: \"f9ced7e0-e303-4e2f-9642-e2c46ac4800a\") " pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:26.819975 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.819925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czmvn\" (UniqueName: \"kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.845326 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.845292 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m4fz4"]
Apr 24 21:17:26.871088 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.871056 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.873959 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.873933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4fz4"]
Apr 24 21:17:26.875117 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.875095 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:17:26.875241 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.875168 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:17:26.875241 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.875190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:17:26.875241 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.875210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2bmll\""
Apr 24 21:17:26.875431 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.875402 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:17:26.921054 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921054 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgwzd\" (UniqueName: \"kubernetes.io/projected/f9ced7e0-e303-4e2f-9642-e2c46ac4800a-kube-api-access-lgwzd\") pod \"downloads-6bcc868b7-m27jw\" (UID: \"f9ced7e0-e303-4e2f-9642-e2c46ac4800a\") " pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:26.921303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.921303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8590998a-f1f9-44c1-b54a-f83ca722daad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.921303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czmvn\" (UniqueName: \"kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8590998a-f1f9-44c1-b54a-f83ca722daad-data-volume\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.921544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8590998a-f1f9-44c1-b54a-f83ca722daad-crio-socket\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.921544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279t2\" (UniqueName: \"kubernetes.io/projected/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-api-access-279t2\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:26.921544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.921927 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.922000 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.921959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.922076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.922060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.924114 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.924091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.924209 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.924169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:26.937466 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.937440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgwzd\" (UniqueName: \"kubernetes.io/projected/f9ced7e0-e303-4e2f-9642-e2c46ac4800a-kube-api-access-lgwzd\") pod \"downloads-6bcc868b7-m27jw\" (UID: \"f9ced7e0-e303-4e2f-9642-e2c46ac4800a\") " pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:26.938609 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:26.938577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czmvn\" (UniqueName: \"kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn\") pod \"console-55f577cf7b-ng6nd\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " pod="openshift-console/console-55f577cf7b-ng6nd"
Apr 24 21:17:27.022256 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:27.022256 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8590998a-f1f9-44c1-b54a-f83ca722daad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:27.022488 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8590998a-f1f9-44c1-b54a-f83ca722daad-data-volume\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:27.022488 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8590998a-f1f9-44c1-b54a-f83ca722daad-crio-socket\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4"
Apr 24 21:17:27.022488 ip-10-0-132-81 kubenswrapper[2578]:
I0424 21:17:27.022310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-279t2\" (UniqueName: \"kubernetes.io/projected/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-api-access-279t2\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.022488 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8590998a-f1f9-44c1-b54a-f83ca722daad-crio-socket\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.022806 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8590998a-f1f9-44c1-b54a-f83ca722daad-data-volume\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.022806 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.022750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.024606 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.024583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8590998a-f1f9-44c1-b54a-f83ca722daad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4fz4\" (UID: 
\"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.042777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.042751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-279t2\" (UniqueName: \"kubernetes.io/projected/8590998a-f1f9-44c1-b54a-f83ca722daad-kube-api-access-279t2\") pod \"insights-runtime-extractor-m4fz4\" (UID: \"8590998a-f1f9-44c1-b54a-f83ca722daad\") " pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.081491 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.081462 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m27jw" Apr 24 21:17:27.094478 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.094449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:17:27.193089 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.193053 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4fz4" Apr 24 21:17:27.218221 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.218190 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m27jw"] Apr 24 21:17:27.222419 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:27.222390 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ced7e0_e303_4e2f_9642_e2c46ac4800a.slice/crio-4d4530899ca4f9d678fc5403e974d04289f6b9871f5f9a16e97c07520e758245 WatchSource:0}: Error finding container 4d4530899ca4f9d678fc5403e974d04289f6b9871f5f9a16e97c07520e758245: Status 404 returned error can't find the container with id 4d4530899ca4f9d678fc5403e974d04289f6b9871f5f9a16e97c07520e758245 Apr 24 21:17:27.250089 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.249777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"] Apr 24 21:17:27.253287 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:27.253262 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbae862e_6962_4e66_9699_0d72da740cc0.slice/crio-5cd4a51087ca3ceb7f9bd23bb32f1d7b843c60d9e9aadf0861b18b942e9ab647 WatchSource:0}: Error finding container 5cd4a51087ca3ceb7f9bd23bb32f1d7b843c60d9e9aadf0861b18b942e9ab647: Status 404 returned error can't find the container with id 5cd4a51087ca3ceb7f9bd23bb32f1d7b843c60d9e9aadf0861b18b942e9ab647 Apr 24 21:17:27.317022 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.316929 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4fz4"] Apr 24 21:17:27.321711 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:27.321659 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8590998a_f1f9_44c1_b54a_f83ca722daad.slice/crio-b8e8ab8ce934fbde77cdbbab651b101d0497915a7685e59ac6ef6f8fa1c7b829 WatchSource:0}: Error finding container b8e8ab8ce934fbde77cdbbab651b101d0497915a7685e59ac6ef6f8fa1c7b829: Status 404 returned error can't find the container with id b8e8ab8ce934fbde77cdbbab651b101d0497915a7685e59ac6ef6f8fa1c7b829 Apr 24 21:17:27.928789 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.928193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:17:27.928789 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.928316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:17:27.931992 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.931939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0d12939-1428-44ac-bf2e-c0fee7bf5161-cert\") pod \"ingress-canary-srdps\" (UID: \"e0d12939-1428-44ac-bf2e-c0fee7bf5161\") " pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:17:27.933649 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:27.933602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"image-registry-5c7bdc7564-g4mfd\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " 
pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:17:28.077306 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.077238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4fz4" event={"ID":"8590998a-f1f9-44c1-b54a-f83ca722daad","Type":"ContainerStarted","Data":"0f6421654de70c50e1b398961b0fcc718694ac806d61d1b65301e9afd6fece65"} Apr 24 21:17:28.077306 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.077284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4fz4" event={"ID":"8590998a-f1f9-44c1-b54a-f83ca722daad","Type":"ContainerStarted","Data":"b8e8ab8ce934fbde77cdbbab651b101d0497915a7685e59ac6ef6f8fa1c7b829"} Apr 24 21:17:28.078528 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.078497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f577cf7b-ng6nd" event={"ID":"fbae862e-6962-4e66-9699-0d72da740cc0","Type":"ContainerStarted","Data":"5cd4a51087ca3ceb7f9bd23bb32f1d7b843c60d9e9aadf0861b18b942e9ab647"} Apr 24 21:17:28.080465 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.080421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m27jw" event={"ID":"f9ced7e0-e303-4e2f-9642-e2c46ac4800a","Type":"ContainerStarted","Data":"4d4530899ca4f9d678fc5403e974d04289f6b9871f5f9a16e97c07520e758245"} Apr 24 21:17:28.135826 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.135794 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jgvbr\"" Apr 24 21:17:28.143155 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.143124 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:17:28.149015 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.148992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-znhx4\"" Apr 24 21:17:28.157579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.157554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-srdps" Apr 24 21:17:28.335287 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.335254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-srdps"] Apr 24 21:17:28.349458 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.349421 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"] Apr 24 21:17:28.500442 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:28.500383 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d12939_1428_44ac_bf2e_c0fee7bf5161.slice/crio-754a15343389b64fe69235dc812e10f0131a40220c051188214df5ad8d07d483 WatchSource:0}: Error finding container 754a15343389b64fe69235dc812e10f0131a40220c051188214df5ad8d07d483: Status 404 returned error can't find the container with id 754a15343389b64fe69235dc812e10f0131a40220c051188214df5ad8d07d483 Apr 24 21:17:28.501365 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:28.501316 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12da021b_8530_4058_b1e8_2689e6c9fdb6.slice/crio-07880974cd5997a58a85192964bf05dca4d2b6f0d52fdb5d4c0130fc9901f678 WatchSource:0}: Error finding container 07880974cd5997a58a85192964bf05dca4d2b6f0d52fdb5d4c0130fc9901f678: Status 404 returned error can't find the container with id 
07880974cd5997a58a85192964bf05dca4d2b6f0d52fdb5d4c0130fc9901f678 Apr 24 21:17:28.533388 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.533324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:17:28.533522 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.533387 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:17:28.535698 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.535660 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:28.535809 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.535696 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:17:28.546319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.546293 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:17:28.557307 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.557272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6kc\" (UniqueName: \"kubernetes.io/projected/87f55c8d-8e86-4982-94b5-0f6145c23361-kube-api-access-8p6kc\") pod \"network-check-target-dwmtg\" (UID: \"87f55c8d-8e86-4982-94b5-0f6145c23361\") " pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 
21:17:28.558731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.558642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/214d9db8-1af2-4a55-8c32-7b0ade9b8b1b-metrics-certs\") pod \"network-metrics-daemon-62n84\" (UID: \"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b\") " pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:17:28.625823 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.625795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jphgp\"" Apr 24 21:17:28.629995 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.629971 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w24qg\"" Apr 24 21:17:28.634288 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.633958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62n84" Apr 24 21:17:28.637983 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.637950 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:17:28.797064 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.797025 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62n84"] Apr 24 21:17:28.802250 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:28.802166 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214d9db8_1af2_4a55_8c32_7b0ade9b8b1b.slice/crio-02de7dfa765c2f2f9ae9aaa817acc084eb8cc13200d727755338f9441fe0d68f WatchSource:0}: Error finding container 02de7dfa765c2f2f9ae9aaa817acc084eb8cc13200d727755338f9441fe0d68f: Status 404 returned error can't find the container with id 02de7dfa765c2f2f9ae9aaa817acc084eb8cc13200d727755338f9441fe0d68f Apr 24 21:17:28.831210 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:28.831178 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dwmtg"] Apr 24 21:17:28.834975 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:28.834931 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f55c8d_8e86_4982_94b5_0f6145c23361.slice/crio-32a006a17fda94a2f837e829ef2e44384049392e05acdfe398b8ac01a9591db8 WatchSource:0}: Error finding container 32a006a17fda94a2f837e829ef2e44384049392e05acdfe398b8ac01a9591db8: Status 404 returned error can't find the container with id 32a006a17fda94a2f837e829ef2e44384049392e05acdfe398b8ac01a9591db8 Apr 24 21:17:29.086209 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.086090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62n84" event={"ID":"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b","Type":"ContainerStarted","Data":"02de7dfa765c2f2f9ae9aaa817acc084eb8cc13200d727755338f9441fe0d68f"} Apr 24 21:17:29.087954 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:17:29.087886 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-srdps" event={"ID":"e0d12939-1428-44ac-bf2e-c0fee7bf5161","Type":"ContainerStarted","Data":"754a15343389b64fe69235dc812e10f0131a40220c051188214df5ad8d07d483"} Apr 24 21:17:29.091526 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.091449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dwmtg" event={"ID":"87f55c8d-8e86-4982-94b5-0f6145c23361","Type":"ContainerStarted","Data":"32a006a17fda94a2f837e829ef2e44384049392e05acdfe398b8ac01a9591db8"} Apr 24 21:17:29.096598 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.096539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4fz4" event={"ID":"8590998a-f1f9-44c1-b54a-f83ca722daad","Type":"ContainerStarted","Data":"bd061165072ac377758531fe19be2b5a3bfaa998fa35234bbd5ee5897d0f54dc"} Apr 24 21:17:29.099227 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.098715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" event={"ID":"12da021b-8530-4058-b1e8-2689e6c9fdb6","Type":"ContainerStarted","Data":"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3"} Apr 24 21:17:29.099227 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.098747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" event={"ID":"12da021b-8530-4058-b1e8-2689e6c9fdb6","Type":"ContainerStarted","Data":"07880974cd5997a58a85192964bf05dca4d2b6f0d52fdb5d4c0130fc9901f678"} Apr 24 21:17:29.099227 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.098953 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:17:29.129861 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:29.128983 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" podStartSLOduration=61.128961363 podStartE2EDuration="1m1.128961363s" podCreationTimestamp="2026-04-24 21:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:29.126493468 +0000 UTC m=+66.977681968" watchObservedRunningTime="2026-04-24 21:17:29.128961363 +0000 UTC m=+66.980149899" Apr 24 21:17:34.119385 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.119345 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62n84" event={"ID":"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b","Type":"ContainerStarted","Data":"7235614bbd0ac0eb6643518ba5de015c28d40bbc5444d73882f7634776518973"} Apr 24 21:17:34.119819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.119394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62n84" event={"ID":"214d9db8-1af2-4a55-8c32-7b0ade9b8b1b","Type":"ContainerStarted","Data":"f2cc94bd55ba26d1e19e44558aadc8e8f8f0149e81fad597ce69c075fa6f6178"} Apr 24 21:17:34.121771 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.121739 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-srdps" event={"ID":"e0d12939-1428-44ac-bf2e-c0fee7bf5161","Type":"ContainerStarted","Data":"7d515335f73e6494d19779bf9c351f30129cf6d00911c2a7dc61659a94c41d61"} Apr 24 21:17:34.123655 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.123615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dwmtg" event={"ID":"87f55c8d-8e86-4982-94b5-0f6145c23361","Type":"ContainerStarted","Data":"a270cff9e4006d7ddc6107a0af905cc98cc1fee87f5d99b474887d6878d6f45b"} Apr 24 21:17:34.123812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.123766 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dwmtg" Apr 24 21:17:34.128792 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.128127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4fz4" event={"ID":"8590998a-f1f9-44c1-b54a-f83ca722daad","Type":"ContainerStarted","Data":"c7d69fdb28218b28f8408d6bc26ab2af16fef83f9d4d912b1e117afe22a8d7ad"} Apr 24 21:17:34.130508 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.130468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f577cf7b-ng6nd" event={"ID":"fbae862e-6962-4e66-9699-0d72da740cc0","Type":"ContainerStarted","Data":"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3"} Apr 24 21:17:34.146835 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.146771 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-62n84" podStartSLOduration=67.153748657 podStartE2EDuration="1m12.146751636s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.804918651 +0000 UTC m=+66.656107135" lastFinishedPulling="2026-04-24 21:17:33.797921638 +0000 UTC m=+71.649110114" observedRunningTime="2026-04-24 21:17:34.145883441 +0000 UTC m=+71.997071935" watchObservedRunningTime="2026-04-24 21:17:34.146751636 +0000 UTC m=+71.997940136" Apr 24 21:17:34.172742 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.172658 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dwmtg" podStartSLOduration=67.197963974 podStartE2EDuration="1m12.1726439s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.837871055 +0000 UTC m=+66.689059534" lastFinishedPulling="2026-04-24 21:17:33.81255097 +0000 UTC m=+71.663739460" observedRunningTime="2026-04-24 21:17:34.171978613 +0000 
UTC m=+72.023167132" watchObservedRunningTime="2026-04-24 21:17:34.1726439 +0000 UTC m=+72.023832399" Apr 24 21:17:34.194370 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.194312 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m4fz4" podStartSLOduration=1.847042891 podStartE2EDuration="8.194292249s" podCreationTimestamp="2026-04-24 21:17:26 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.451994211 +0000 UTC m=+65.303182700" lastFinishedPulling="2026-04-24 21:17:33.799243567 +0000 UTC m=+71.650432058" observedRunningTime="2026-04-24 21:17:34.192924964 +0000 UTC m=+72.044113488" watchObservedRunningTime="2026-04-24 21:17:34.194292249 +0000 UTC m=+72.045480749" Apr 24 21:17:34.213664 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:34.213607 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-srdps" podStartSLOduration=33.944263908 podStartE2EDuration="39.213594157s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.529080213 +0000 UTC m=+66.380268702" lastFinishedPulling="2026-04-24 21:17:33.798410455 +0000 UTC m=+71.649598951" observedRunningTime="2026-04-24 21:17:34.213138345 +0000 UTC m=+72.064326843" watchObservedRunningTime="2026-04-24 21:17:34.213594157 +0000 UTC m=+72.064782656" Apr 24 21:17:37.095137 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:37.095098 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:17:37.095137 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:37.095148 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:17:37.100634 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:37.100606 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:17:37.144481 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:37.144426 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55f577cf7b-ng6nd" podStartSLOduration=4.602503076 podStartE2EDuration="11.144404799s" podCreationTimestamp="2026-04-24 21:17:26 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.25579194 +0000 UTC m=+65.106980418" lastFinishedPulling="2026-04-24 21:17:33.797693646 +0000 UTC m=+71.648882141" observedRunningTime="2026-04-24 21:17:34.23694256 +0000 UTC m=+72.088131060" watchObservedRunningTime="2026-04-24 21:17:37.144404799 +0000 UTC m=+74.995593299" Apr 24 21:17:37.147363 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:37.147338 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:17:40.376319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.376280 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"] Apr 24 21:17:40.503286 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.503253 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n2flz"] Apr 24 21:17:40.507932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.507898 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-n2flz" Apr 24 21:17:40.510623 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.510516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gdtmb\"" Apr 24 21:17:40.517459 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517420 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:17:40.517590 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517468 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:17:40.517590 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517547 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:17:40.517590 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517417 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:17:40.517813 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517723 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:17:40.517879 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.517839 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:17:40.635908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.635831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-sys\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz" Apr 24 
21:17:40.635908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.635878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.635908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.635909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-root\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.635949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft8tn\" (UniqueName: \"kubernetes.io/projected/9f3aec88-d654-473c-a26a-236aeb20a6cd-kube-api-access-ft8tn\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.635989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-wtmp\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.636017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-metrics-client-ca\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.636044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.636064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-textfile\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.636412 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.636179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736644 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-sys\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-root\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-sys\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft8tn\" (UniqueName: \"kubernetes.io/projected/9f3aec88-d654-473c-a26a-236aeb20a6cd-kube-api-access-ft8tn\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:40.736796 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-wtmp\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.736852 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-metrics-client-ca\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737244 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:17:40.736866 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls podName:9f3aec88-d654-473c-a26a-236aeb20a6cd nodeName:}" failed. No retries permitted until 2026-04-24 21:17:41.236842138 +0000 UTC m=+79.088030622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls") pod "node-exporter-n2flz" (UID: "9f3aec88-d654-473c-a26a-236aeb20a6cd") : secret "node-exporter-tls" not found
Apr 24 21:17:40.737244 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737244 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.736948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-textfile\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737408 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.737328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-textfile\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737461 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.737400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737541 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.737511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3aec88-d654-473c-a26a-236aeb20a6cd-metrics-client-ca\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.737623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-root\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.737669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.737655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-wtmp\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.740108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.740083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:40.775652 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:40.775616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft8tn\" (UniqueName: \"kubernetes.io/projected/9f3aec88-d654-473c-a26a-236aeb20a6cd-kube-api-access-ft8tn\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:41.241885 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:41.241759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:41.244382 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:41.244350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9f3aec88-d654-473c-a26a-236aeb20a6cd-node-exporter-tls\") pod \"node-exporter-n2flz\" (UID: \"9f3aec88-d654-473c-a26a-236aeb20a6cd\") " pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:41.420946 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:41.420914 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n2flz"
Apr 24 21:17:43.833308 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.833272 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7b985d598b-262rf"]
Apr 24 21:17:43.839727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.839692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.845412 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.845374 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:17:43.848528 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.848246 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:17:43.854732 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.854667 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-bqrlw\""
Apr 24 21:17:43.855196 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.855171 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:17:43.855314 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.855273 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:17:43.855572 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.855554 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:17:43.868230 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.868195 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b985d598b-262rf"]
Apr 24 21:17:43.892160 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.892129 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-3hho87dq553m0\""
Apr 24 21:17:43.965287 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-grpc-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3418a28a-7b63-4978-949c-05cef2f4db92-metrics-client-ca\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965707 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:43.965707 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:43.965536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbx8c\" (UniqueName: \"kubernetes.io/projected/3418a28a-7b63-4978-949c-05cef2f4db92-kube-api-access-rbx8c\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.066806 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbx8c\" (UniqueName: \"kubernetes.io/projected/3418a28a-7b63-4978-949c-05cef2f4db92-kube-api-access-rbx8c\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-grpc-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.066955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3418a28a-7b63-4978-949c-05cef2f4db92-metrics-client-ca\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.067001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067312 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.067030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.067912 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.067859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3418a28a-7b63-4978-949c-05cef2f4db92-metrics-client-ca\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070265 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070265 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070437 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070411 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070561 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070667 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-grpc-tls\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.070745 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.070673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3418a28a-7b63-4978-949c-05cef2f4db92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.081336 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.081312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbx8c\" (UniqueName: \"kubernetes.io/projected/3418a28a-7b63-4978-949c-05cef2f4db92-kube-api-access-rbx8c\") pod \"thanos-querier-7b985d598b-262rf\" (UID: \"3418a28a-7b63-4978-949c-05cef2f4db92\") " pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.152242 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.152150 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:44.480143 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:44.480096 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3aec88_d654_473c_a26a_236aeb20a6cd.slice/crio-97b68a8f32e21cbad424c9ab3daf5f9dbdb700afbb823dd3186c4382af7c012c WatchSource:0}: Error finding container 97b68a8f32e21cbad424c9ab3daf5f9dbdb700afbb823dd3186c4382af7c012c: Status 404 returned error can't find the container with id 97b68a8f32e21cbad424c9ab3daf5f9dbdb700afbb823dd3186c4382af7c012c
Apr 24 21:17:44.525024 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.524984 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"]
Apr 24 21:17:44.528247 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.528224 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:44.531135 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.530948 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4vhxp\""
Apr 24 21:17:44.531573 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.531403 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 21:17:44.545214 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.545185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"]
Apr 24 21:17:44.619110 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.619084 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b985d598b-262rf"]
Apr 24 21:17:44.623695 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:44.623648 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3418a28a_7b63_4978_949c_05cef2f4db92.slice/crio-673295573cf72d55b4b679440a160bf95bc4e41549453a6e224319eda5158e88 WatchSource:0}: Error finding container 673295573cf72d55b4b679440a160bf95bc4e41549453a6e224319eda5158e88: Status 404 returned error can't find the container with id 673295573cf72d55b4b679440a160bf95bc4e41549453a6e224319eda5158e88
Apr 24 21:17:44.670649 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.670588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d5a35927-4abc-4212-9393-0fc58f4776db-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n677g\" (UID: \"d5a35927-4abc-4212-9393-0fc58f4776db\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:44.771992 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.771948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d5a35927-4abc-4212-9393-0fc58f4776db-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n677g\" (UID: \"d5a35927-4abc-4212-9393-0fc58f4776db\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:44.775006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.774974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d5a35927-4abc-4212-9393-0fc58f4776db-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n677g\" (UID: \"d5a35927-4abc-4212-9393-0fc58f4776db\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:44.842906 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.842851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:44.989613 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:44.989559 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"]
Apr 24 21:17:44.993155 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:17:44.993123 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a35927_4abc_4212_9393_0fc58f4776db.slice/crio-9c1f6c84d89a2617ddd09f378788935fff4da5e658cde2de82da29627055b545 WatchSource:0}: Error finding container 9c1f6c84d89a2617ddd09f378788935fff4da5e658cde2de82da29627055b545: Status 404 returned error can't find the container with id 9c1f6c84d89a2617ddd09f378788935fff4da5e658cde2de82da29627055b545
Apr 24 21:17:45.169006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.168894 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2flz" event={"ID":"9f3aec88-d654-473c-a26a-236aeb20a6cd","Type":"ContainerStarted","Data":"97b68a8f32e21cbad424c9ab3daf5f9dbdb700afbb823dd3186c4382af7c012c"}
Apr 24 21:17:45.170995 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.170930 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g" event={"ID":"d5a35927-4abc-4212-9393-0fc58f4776db","Type":"ContainerStarted","Data":"9c1f6c84d89a2617ddd09f378788935fff4da5e658cde2de82da29627055b545"}
Apr 24 21:17:45.172836 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.172736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m27jw" event={"ID":"f9ced7e0-e303-4e2f-9642-e2c46ac4800a","Type":"ContainerStarted","Data":"db05b6d291e5325029f6e703dc1676c86a6b3086a8daf5108fdc7ad0f16ec0a9"}
Apr 24 21:17:45.172836 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.172794 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:45.174610 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.174540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"673295573cf72d55b4b679440a160bf95bc4e41549453a6e224319eda5158e88"}
Apr 24 21:17:45.187304 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.187273 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-m27jw"
Apr 24 21:17:45.207433 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:45.207371 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-m27jw" podStartSLOduration=1.859671871 podStartE2EDuration="19.20734999s" podCreationTimestamp="2026-04-24 21:17:26 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.224277727 +0000 UTC m=+65.075466205" lastFinishedPulling="2026-04-24 21:17:44.571955833 +0000 UTC m=+82.423144324" observedRunningTime="2026-04-24 21:17:45.20616255 +0000 UTC m=+83.057351051" watchObservedRunningTime="2026-04-24 21:17:45.20734999 +0000 UTC m=+83.058538496"
Apr 24 21:17:46.180702 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:46.180055 2578 generic.go:358] "Generic (PLEG): container finished" podID="9f3aec88-d654-473c-a26a-236aeb20a6cd" containerID="39b4c9dced27c8258ec9ae155f02472bcabc9a33a7a5fe547274de42d1ba23cc" exitCode=0
Apr 24 21:17:46.180702 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:46.180139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2flz" event={"ID":"9f3aec88-d654-473c-a26a-236aeb20a6cd","Type":"ContainerDied","Data":"39b4c9dced27c8258ec9ae155f02472bcabc9a33a7a5fe547274de42d1ba23cc"}
Apr 24 21:17:48.188474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.188436 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"7fc409355255119b53bf4b9d2b35944a0536828743824a91c109b2a35e72dad9"}
Apr 24 21:17:48.188908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.188479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"980379e659b53e4d1267bbad2cebda1592f729a9696acdca65583e7b8f2d9ae5"}
Apr 24 21:17:48.188908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.188494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"f0cb384c2ebc44f2f5913c2e241daad2dd6dd168dc2ef63ce8ea8778f7c7ac65"}
Apr 24 21:17:48.190657 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.190631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2flz" event={"ID":"9f3aec88-d654-473c-a26a-236aeb20a6cd","Type":"ContainerStarted","Data":"3983da215a277a3e3a47399368f0be8948adb2044d6c4d3d49b4d7455b4d7297"}
Apr 24 21:17:48.190798 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.190666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2flz" event={"ID":"9f3aec88-d654-473c-a26a-236aeb20a6cd","Type":"ContainerStarted","Data":"eaaf7b0b22e329b9660423654e87e2fdd5c09be839acb9d01d1a6c9c4251b2e6"}
Apr 24 21:17:48.192119 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.192093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g" event={"ID":"d5a35927-4abc-4212-9393-0fc58f4776db","Type":"ContainerStarted","Data":"99167cf09464745e045c422b5c53ad9d5173df44933eb3a605b59c9f6bddbb75"}
Apr 24 21:17:48.192323 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.192303 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:48.198048 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.198019 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g"
Apr 24 21:17:48.211849 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.211814 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n2flz" podStartSLOduration=7.262511452 podStartE2EDuration="8.211802367s" podCreationTimestamp="2026-04-24 21:17:40 +0000 UTC" firstStartedPulling="2026-04-24 21:17:44.482190818 +0000 UTC m=+82.333379294" lastFinishedPulling="2026-04-24 21:17:45.431481719 +0000 UTC m=+83.282670209" observedRunningTime="2026-04-24 21:17:48.210498031 +0000 UTC m=+86.061686531" watchObservedRunningTime="2026-04-24 21:17:48.211802367 +0000 UTC m=+86.062990919"
Apr 24 21:17:48.226697 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:48.226585 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n677g" podStartSLOduration=1.497136653 podStartE2EDuration="4.226571461s" podCreationTimestamp="2026-04-24 21:17:44 +0000 UTC" firstStartedPulling="2026-04-24 21:17:44.995941186 +0000 UTC m=+82.847129667" lastFinishedPulling="2026-04-24 21:17:47.725375997 +0000 UTC m=+85.576564475" observedRunningTime="2026-04-24 21:17:48.225349041 +0000 UTC m=+86.076537540" watchObservedRunningTime="2026-04-24 21:17:48.226571461 +0000 UTC m=+86.077759960"
Apr 24 21:17:50.109325 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:50.109293 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd"
Apr 24 21:17:50.204840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:50.204764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"986ee34689b7fa3f4bfb275d5d1253f09d37233e398f780b37764e5585dc7656"}
Apr 24 21:17:50.204840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:50.204822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"bf63bbf3dca670d092ab22a52addf2192cbd91a613d46795c5bb10bbaf88a9f3"}
Apr 24 21:17:50.204840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:50.204840 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" event={"ID":"3418a28a-7b63-4978-949c-05cef2f4db92","Type":"ContainerStarted","Data":"974f8f083cb961bc9c2f407c39ac65f54ef6951ae279823118309cfe0417c66a"}
Apr 24 21:17:50.231140 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:50.231071 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf" podStartSLOduration=2.46556274 podStartE2EDuration="7.231051414s" podCreationTimestamp="2026-04-24 21:17:43 +0000 UTC" firstStartedPulling="2026-04-24 21:17:44.625616129 +0000 UTC m=+82.476804609" lastFinishedPulling="2026-04-24 21:17:49.391104794 +0000 UTC m=+87.242293283" observedRunningTime="2026-04-24 21:17:50.229771858 +0000 UTC m=+88.080960358" watchObservedRunningTime="2026-04-24 21:17:50.231051414 +0000 UTC m=+88.082239914"
Apr 24 21:17:51.208531 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:51.208495 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:17:51.673517 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:51.673473 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"]
Apr 24 21:17:57.218967 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:17:57.218936 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7b985d598b-262rf"
Apr 24 21:18:05.136108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.136074 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dwmtg"
Apr 24 21:18:05.398699 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.398556 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55f577cf7b-ng6nd" podUID="fbae862e-6962-4e66-9699-0d72da740cc0" containerName="console" containerID="cri-o://1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3" gracePeriod=15
Apr 24 21:18:05.688126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.688103 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f577cf7b-ng6nd_fbae862e-6962-4e66-9699-0d72da740cc0/console/0.log"
Apr 24 21:18:05.688248 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.688175 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:18:05.745416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745380 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745618 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745441 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czmvn\" (UniqueName: \"kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745618 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745477 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745618 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745500 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745618 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745521 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config\") pod \"fbae862e-6962-4e66-9699-0d72da740cc0\" (UID: \"fbae862e-6962-4e66-9699-0d72da740cc0\") " Apr 24 21:18:05.745988 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745864 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:05.746126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.745953 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:05.746126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.746033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config" (OuterVolumeSpecName: "console-config") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:05.747976 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.747949 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn" (OuterVolumeSpecName: "kube-api-access-czmvn") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "kube-api-access-czmvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:05.748139 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.748115 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:05.748192 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.748133 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fbae862e-6962-4e66-9699-0d72da740cc0" (UID: "fbae862e-6962-4e66-9699-0d72da740cc0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:05.846223 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846179 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-oauth-serving-cert\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:05.846223 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846215 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-console-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:05.846223 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846225 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbae862e-6962-4e66-9699-0d72da740cc0-service-ca\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:05.846223 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846233 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-oauth-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:05.846514 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846242 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czmvn\" (UniqueName: \"kubernetes.io/projected/fbae862e-6962-4e66-9699-0d72da740cc0-kube-api-access-czmvn\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:05.846514 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:05.846251 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbae862e-6962-4e66-9699-0d72da740cc0-console-serving-cert\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:06.250777 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:18:06.250748 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f577cf7b-ng6nd_fbae862e-6962-4e66-9699-0d72da740cc0/console/0.log" Apr 24 21:18:06.251182 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.250785 2578 generic.go:358] "Generic (PLEG): container finished" podID="fbae862e-6962-4e66-9699-0d72da740cc0" containerID="1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3" exitCode=2 Apr 24 21:18:06.251182 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.250819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f577cf7b-ng6nd" event={"ID":"fbae862e-6962-4e66-9699-0d72da740cc0","Type":"ContainerDied","Data":"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3"} Apr 24 21:18:06.251182 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.250843 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f577cf7b-ng6nd" Apr 24 21:18:06.251182 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.250857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f577cf7b-ng6nd" event={"ID":"fbae862e-6962-4e66-9699-0d72da740cc0","Type":"ContainerDied","Data":"5cd4a51087ca3ceb7f9bd23bb32f1d7b843c60d9e9aadf0861b18b942e9ab647"} Apr 24 21:18:06.251182 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.250875 2578 scope.go:117] "RemoveContainer" containerID="1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3" Apr 24 21:18:06.262879 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.262856 2578 scope.go:117] "RemoveContainer" containerID="1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3" Apr 24 21:18:06.263140 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:18:06.263121 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3\": container with ID starting with 1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3 not found: ID does not exist" containerID="1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3" Apr 24 21:18:06.263199 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.263148 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3"} err="failed to get container status \"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3\": rpc error: code = NotFound desc = could not find container \"1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3\": container with ID starting with 1fdd96fedbc320d9ac6538552fd1ebf94d7a79a86dc497fa2878788a1f93bfb3 not found: ID does not exist" Apr 24 21:18:06.279126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.279102 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"] Apr 24 21:18:06.288277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.288246 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55f577cf7b-ng6nd"] Apr 24 21:18:06.713065 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:06.712988 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbae862e-6962-4e66-9699-0d72da740cc0" path="/var/lib/kubelet/pods/fbae862e-6962-4e66-9699-0d72da740cc0/volumes" Apr 24 21:18:16.696307 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:16.696239 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" podUID="12da021b-8530-4058-b1e8-2689e6c9fdb6" containerName="registry" containerID="cri-o://1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3" gracePeriod=30 Apr 24 21:18:16.992107 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:16.992083 
2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:18:17.017559 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017526 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017604 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017657 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2nj4\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017712 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token\") pod 
\"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017741 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.017768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017770 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.018089 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.017793 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca\") pod \"12da021b-8530-4058-b1e8-2689e6c9fdb6\" (UID: \"12da021b-8530-4058-b1e8-2689e6c9fdb6\") " Apr 24 21:18:17.019072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.018442 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:17.019072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.018566 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:17.020714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.020538 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:17.020854 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.020808 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:17.021428 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.021220 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:17.022029 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.021997 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4" (OuterVolumeSpecName: "kube-api-access-s2nj4") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "kube-api-access-s2nj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:17.022127 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.022048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:17.036796 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.036744 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "12da021b-8530-4058-b1e8-2689e6c9fdb6" (UID: "12da021b-8530-4058-b1e8-2689e6c9fdb6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:18:17.119301 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119268 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-trusted-ca\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119301 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119306 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-image-registry-private-configuration\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119322 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119338 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12da021b-8530-4058-b1e8-2689e6c9fdb6-installation-pull-secrets\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119353 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2nj4\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-kube-api-access-s2nj4\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119368 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12da021b-8530-4058-b1e8-2689e6c9fdb6-bound-sa-token\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 
21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119381 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12da021b-8530-4058-b1e8-2689e6c9fdb6-ca-trust-extracted\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.119489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.119396 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12da021b-8530-4058-b1e8-2689e6c9fdb6-registry-certificates\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:18:17.285498 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.285460 2578 generic.go:358] "Generic (PLEG): container finished" podID="12da021b-8530-4058-b1e8-2689e6c9fdb6" containerID="1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3" exitCode=0 Apr 24 21:18:17.285727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.285524 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" Apr 24 21:18:17.285727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.285534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" event={"ID":"12da021b-8530-4058-b1e8-2689e6c9fdb6","Type":"ContainerDied","Data":"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3"} Apr 24 21:18:17.285727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.285572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c7bdc7564-g4mfd" event={"ID":"12da021b-8530-4058-b1e8-2689e6c9fdb6","Type":"ContainerDied","Data":"07880974cd5997a58a85192964bf05dca4d2b6f0d52fdb5d4c0130fc9901f678"} Apr 24 21:18:17.285727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.285588 2578 scope.go:117] "RemoveContainer" containerID="1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3" Apr 24 21:18:17.300864 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.300812 2578 scope.go:117] "RemoveContainer" containerID="1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3" Apr 24 21:18:17.301188 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:18:17.301164 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3\": container with ID starting with 1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3 not found: ID does not exist" containerID="1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3" Apr 24 21:18:17.301281 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.301201 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3"} err="failed to get container status 
\"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3\": rpc error: code = NotFound desc = could not find container \"1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3\": container with ID starting with 1f2317a036cbb3708c79a77066c725b4c67b58076dac6652bb28f74fe51ceba3 not found: ID does not exist" Apr 24 21:18:17.309548 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.309518 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"] Apr 24 21:18:17.315936 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:17.314286 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c7bdc7564-g4mfd"] Apr 24 21:18:18.711926 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:18:18.711893 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12da021b-8530-4058-b1e8-2689e6c9fdb6" path="/var/lib/kubelet/pods/12da021b-8530-4058-b1e8-2689e6c9fdb6/volumes" Apr 24 21:21:22.632435 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:21:22.632404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:21:22.633000 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:21:22.632404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:21:22.636498 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:21:22.636475 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:24:18.162151 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162109 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162379 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="12da021b-8530-4058-b1e8-2689e6c9fdb6" containerName="registry" Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162390 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="12da021b-8530-4058-b1e8-2689e6c9fdb6" containerName="registry" Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162399 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbae862e-6962-4e66-9699-0d72da740cc0" containerName="console" Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162405 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbae862e-6962-4e66-9699-0d72da740cc0" containerName="console" Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162460 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="12da021b-8530-4058-b1e8-2689e6c9fdb6" containerName="registry" Apr 24 21:24:18.162575 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.162468 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbae862e-6962-4e66-9699-0d72da740cc0" containerName="console" Apr 24 21:24:18.165202 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.165186 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.168533 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168501 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:24:18.168692 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168657 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wpj8v\"" Apr 24 21:24:18.168769 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168657 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:24:18.168851 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168833 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:24:18.168919 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168867 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:24:18.168964 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.168944 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:24:18.173817 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.173797 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:24:18.176660 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.176642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:24:18.312927 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.312886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca\") pod 
\"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.312949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.313019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.313056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.313081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.313132 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6z7q\" (UniqueName: \"kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.313317 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.313176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" 
Apr 24 21:24:18.413731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413979 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6z7q\" (UniqueName: \"kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413979 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.413979 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.413817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.414463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.414433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " 
pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.414579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.414465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.414579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.414475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.414847 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.414826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.416120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.416098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.416293 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.416276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert\") pod \"console-67ddfbdbd5-g6vdq\" (UID: 
\"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.422043 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.422025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6z7q\" (UniqueName: \"kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q\") pod \"console-67ddfbdbd5-g6vdq\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.475534 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.475502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:18.591548 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.591524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:24:18.594044 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:24:18.594017 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035ee76b_7011_4224_94c4_cbcf1b848a0b.slice/crio-4c6d4f5436235076853257f3399f972e1bcf595bd8221cdf5dfcd8662a4663af WatchSource:0}: Error finding container 4c6d4f5436235076853257f3399f972e1bcf595bd8221cdf5dfcd8662a4663af: Status 404 returned error can't find the container with id 4c6d4f5436235076853257f3399f972e1bcf595bd8221cdf5dfcd8662a4663af Apr 24 21:24:18.595950 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:18.595933 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:24:19.201790 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:19.201755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ddfbdbd5-g6vdq" event={"ID":"035ee76b-7011-4224-94c4-cbcf1b848a0b","Type":"ContainerStarted","Data":"e7982edd1825d958b12b5141acac4d2e597b2fe949eac86ff43d3232cb95bc14"} Apr 24 21:24:19.201790 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:19.201791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ddfbdbd5-g6vdq" event={"ID":"035ee76b-7011-4224-94c4-cbcf1b848a0b","Type":"ContainerStarted","Data":"4c6d4f5436235076853257f3399f972e1bcf595bd8221cdf5dfcd8662a4663af"} Apr 24 21:24:19.221696 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:19.221636 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67ddfbdbd5-g6vdq" podStartSLOduration=1.221620533 podStartE2EDuration="1.221620533s" podCreationTimestamp="2026-04-24 21:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:24:19.219789711 +0000 UTC m=+477.070978210" watchObservedRunningTime="2026-04-24 21:24:19.221620533 +0000 UTC m=+477.072809032" Apr 24 21:24:28.476287 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:28.476249 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:28.476287 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:28.476290 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:28.481136 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:28.481113 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:24:29.229770 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:24:29.229739 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:25:04.016767 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.016734 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl"] Apr 24 21:25:04.019930 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:25:04.019913 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.022551 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.022523 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:25:04.023497 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.023477 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:25:04.023579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.023479 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bbr4w\"" Apr 24 21:25:04.028720 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.028696 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl"] Apr 24 21:25:04.114585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.114552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.115006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.114587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkk5\" (UniqueName: \"kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.115225 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.115201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.215827 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.215791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.216010 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.215892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.216010 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.215919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkk5\" (UniqueName: \"kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.216303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.216279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.216303 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.216292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.233727 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.233696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkk5\" (UniqueName: \"kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.329080 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.328993 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:04.450653 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:04.450630 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl"] Apr 24 21:25:04.453173 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:25:04.453143 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6166023e_1837_49ce_a72f_c9547dc9e3a6.slice/crio-a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268 WatchSource:0}: Error finding container a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268: Status 404 returned error can't find the container with id a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268 Apr 24 21:25:05.323873 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.323827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" event={"ID":"6166023e-1837-49ce-a72f-c9547dc9e3a6","Type":"ContainerStarted","Data":"a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268"} Apr 24 21:25:05.467889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.467855 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n"] Apr 24 21:25:05.471103 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.471078 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.473932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.473905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:25:05.475036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.474891 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:25:05.475036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.474911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9mr8n\"" Apr 24 21:25:05.475036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.474931 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:25:05.477252 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.477232 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:25:05.483652 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.483620 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n"] Apr 24 21:25:05.489818 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.489792 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x"] Apr 24 21:25:05.493161 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.492979 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.495908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.495778 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:25:05.514368 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.514338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x"] Apr 24 21:25:05.527172 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.527127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsp9\" (UniqueName: \"kubernetes.io/projected/c327a679-a3f0-445a-919c-ae821c822efc-kube-api-access-vbsp9\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.527318 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.527178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2ea4ff06-fbd9-4660-89e1-55160df72184-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.527318 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.527197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c327a679-a3f0-445a-919c-ae821c822efc-tmp\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.527423 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.527313 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px45w\" (UniqueName: \"kubernetes.io/projected/2ea4ff06-fbd9-4660-89e1-55160df72184-kube-api-access-px45w\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.527423 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.527352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c327a679-a3f0-445a-919c-ae821c822efc-klusterlet-config\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.628103 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628013 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsp9\" (UniqueName: \"kubernetes.io/projected/c327a679-a3f0-445a-919c-ae821c822efc-kube-api-access-vbsp9\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.628103 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2ea4ff06-fbd9-4660-89e1-55160df72184-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.628351 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c327a679-a3f0-445a-919c-ae821c822efc-tmp\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.628351 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px45w\" (UniqueName: \"kubernetes.io/projected/2ea4ff06-fbd9-4660-89e1-55160df72184-kube-api-access-px45w\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.628351 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c327a679-a3f0-445a-919c-ae821c822efc-klusterlet-config\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.628591 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.628566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c327a679-a3f0-445a-919c-ae821c822efc-tmp\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.630984 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.630955 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2ea4ff06-fbd9-4660-89e1-55160df72184-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.631120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.631102 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c327a679-a3f0-445a-919c-ae821c822efc-klusterlet-config\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.638952 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.638906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px45w\" (UniqueName: \"kubernetes.io/projected/2ea4ff06-fbd9-4660-89e1-55160df72184-kube-api-access-px45w\") pod \"managed-serviceaccount-addon-agent-544648b5b5-4kc4n\" (UID: \"2ea4ff06-fbd9-4660-89e1-55160df72184\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.640933 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.640910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsp9\" (UniqueName: \"kubernetes.io/projected/c327a679-a3f0-445a-919c-ae821c822efc-kube-api-access-vbsp9\") pod \"klusterlet-addon-workmgr-746dd99b58-f855x\" (UID: \"c327a679-a3f0-445a-919c-ae821c822efc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.793137 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.793101 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" Apr 24 21:25:05.804013 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.803987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:05.942831 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.942794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n"] Apr 24 21:25:05.946318 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:25:05.946283 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea4ff06_fbd9_4660_89e1_55160df72184.slice/crio-d2eebc4c60807119b1c8d9ffdcc3066200a5477973b7e6f04bdf4d29576d9c3e WatchSource:0}: Error finding container d2eebc4c60807119b1c8d9ffdcc3066200a5477973b7e6f04bdf4d29576d9c3e: Status 404 returned error can't find the container with id d2eebc4c60807119b1c8d9ffdcc3066200a5477973b7e6f04bdf4d29576d9c3e Apr 24 21:25:05.961567 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:05.961544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x"] Apr 24 21:25:05.964326 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:25:05.964297 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc327a679_a3f0_445a_919c_ae821c822efc.slice/crio-dcc4e6f5a888aeef2819ffc51d84a3c129a342ad6954273578fdd88da02ac32f WatchSource:0}: Error finding container dcc4e6f5a888aeef2819ffc51d84a3c129a342ad6954273578fdd88da02ac32f: Status 404 returned error can't find the container with id dcc4e6f5a888aeef2819ffc51d84a3c129a342ad6954273578fdd88da02ac32f Apr 24 21:25:06.328120 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:25:06.328083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" event={"ID":"c327a679-a3f0-445a-919c-ae821c822efc","Type":"ContainerStarted","Data":"dcc4e6f5a888aeef2819ffc51d84a3c129a342ad6954273578fdd88da02ac32f"} Apr 24 21:25:06.329331 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:06.329299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" event={"ID":"2ea4ff06-fbd9-4660-89e1-55160df72184","Type":"ContainerStarted","Data":"d2eebc4c60807119b1c8d9ffdcc3066200a5477973b7e6f04bdf4d29576d9c3e"} Apr 24 21:25:12.349944 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.349847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" event={"ID":"c327a679-a3f0-445a-919c-ae821c822efc","Type":"ContainerStarted","Data":"b9381378cf41c147f38f43735d4b2bbeb74fdd92b98f0dc2d23322a97b8f2cb8"} Apr 24 21:25:12.350408 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.350043 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:12.351356 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.351320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" event={"ID":"2ea4ff06-fbd9-4660-89e1-55160df72184","Type":"ContainerStarted","Data":"d1e579a0df91423c9abb2fc7414a424e98855007fd1f16a7851fd49a94d77227"} Apr 24 21:25:12.351748 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.351727 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" Apr 24 21:25:12.352763 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:25:12.352740 2578 generic.go:358] "Generic (PLEG): container finished" podID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerID="1be972cd3fd4879a00dae55a6acfb552c79ccef68af56d54d11727c80a69cc5e" exitCode=0 Apr 24 21:25:12.352832 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.352798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" event={"ID":"6166023e-1837-49ce-a72f-c9547dc9e3a6","Type":"ContainerDied","Data":"1be972cd3fd4879a00dae55a6acfb552c79ccef68af56d54d11727c80a69cc5e"} Apr 24 21:25:12.366698 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.366636 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-746dd99b58-f855x" podStartSLOduration=1.217152477 podStartE2EDuration="7.366624037s" podCreationTimestamp="2026-04-24 21:25:05 +0000 UTC" firstStartedPulling="2026-04-24 21:25:05.966282465 +0000 UTC m=+523.817470958" lastFinishedPulling="2026-04-24 21:25:12.115754041 +0000 UTC m=+529.966942518" observedRunningTime="2026-04-24 21:25:12.366068431 +0000 UTC m=+530.217256928" watchObservedRunningTime="2026-04-24 21:25:12.366624037 +0000 UTC m=+530.217812536" Apr 24 21:25:12.415056 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:12.415008 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-544648b5b5-4kc4n" podStartSLOduration=1.263751986 podStartE2EDuration="7.414993725s" podCreationTimestamp="2026-04-24 21:25:05 +0000 UTC" firstStartedPulling="2026-04-24 21:25:05.948797488 +0000 UTC m=+523.799985980" lastFinishedPulling="2026-04-24 21:25:12.100039239 +0000 UTC m=+529.951227719" observedRunningTime="2026-04-24 21:25:12.413713061 +0000 UTC m=+530.264901557" watchObservedRunningTime="2026-04-24 21:25:12.414993725 +0000 UTC m=+530.266182224" Apr 24 21:25:15.364897 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:25:15.364859 2578 generic.go:358] "Generic (PLEG): container finished" podID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerID="3ea03e7cb57f067f97435a3777f108e4caf9feec8246a680622be871e80e1526" exitCode=0
Apr 24 21:25:15.365281 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:15.364943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" event={"ID":"6166023e-1837-49ce-a72f-c9547dc9e3a6","Type":"ContainerDied","Data":"3ea03e7cb57f067f97435a3777f108e4caf9feec8246a680622be871e80e1526"}
Apr 24 21:25:22.386887 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:22.386785 2578 generic.go:358] "Generic (PLEG): container finished" podID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerID="d7f2f998da54cca9bcfc615cbdb524e029f36a97d009c0574a1494e2cef717c4" exitCode=0
Apr 24 21:25:22.386887 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:22.386825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" event={"ID":"6166023e-1837-49ce-a72f-c9547dc9e3a6","Type":"ContainerDied","Data":"d7f2f998da54cca9bcfc615cbdb524e029f36a97d009c0574a1494e2cef717c4"}
Apr 24 21:25:23.509299 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.509269 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:23.583100 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.583062 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util\") pod \"6166023e-1837-49ce-a72f-c9547dc9e3a6\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " Apr 24 21:25:23.583268 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.583115 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdkk5\" (UniqueName: \"kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5\") pod \"6166023e-1837-49ce-a72f-c9547dc9e3a6\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " Apr 24 21:25:23.583268 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.583139 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle\") pod \"6166023e-1837-49ce-a72f-c9547dc9e3a6\" (UID: \"6166023e-1837-49ce-a72f-c9547dc9e3a6\") " Apr 24 21:25:23.583746 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.583720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle" (OuterVolumeSpecName: "bundle") pod "6166023e-1837-49ce-a72f-c9547dc9e3a6" (UID: "6166023e-1837-49ce-a72f-c9547dc9e3a6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:25:23.585397 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.585350 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5" (OuterVolumeSpecName: "kube-api-access-vdkk5") pod "6166023e-1837-49ce-a72f-c9547dc9e3a6" (UID: "6166023e-1837-49ce-a72f-c9547dc9e3a6"). InnerVolumeSpecName "kube-api-access-vdkk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:23.587327 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.587306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util" (OuterVolumeSpecName: "util") pod "6166023e-1837-49ce-a72f-c9547dc9e3a6" (UID: "6166023e-1837-49ce-a72f-c9547dc9e3a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:25:23.684626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.684530 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-util\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:25:23.684626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.684564 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdkk5\" (UniqueName: \"kubernetes.io/projected/6166023e-1837-49ce-a72f-c9547dc9e3a6-kube-api-access-vdkk5\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:25:23.684626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:23.684578 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6166023e-1837-49ce-a72f-c9547dc9e3a6-bundle\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:25:24.393652 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:24.393614 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" event={"ID":"6166023e-1837-49ce-a72f-c9547dc9e3a6","Type":"ContainerDied","Data":"a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268"} Apr 24 21:25:24.393652 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:24.393659 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a700ea62b581b5a2f35162a2c0f7b27eabd31772229f1eb2544a05a044200268" Apr 24 21:25:24.393882 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:24.393628 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckvhnl" Apr 24 21:25:31.116860 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.116825 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7"] Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117214 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="util" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117231 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="util" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117253 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="extract" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117261 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="extract" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117273 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="pull" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117281 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="pull" Apr 24 21:25:31.117489 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.117361 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6166023e-1837-49ce-a72f-c9547dc9e3a6" containerName="extract" Apr 24 21:25:31.179632 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.179591 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7"] Apr 24 21:25:31.179830 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.179763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.183138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.183109 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:25:31.183138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.183129 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:25:31.183371 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.183200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:25:31.183480 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.183466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-ffw4c\"" Apr 24 21:25:31.245262 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.245226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqvh\" (UniqueName: 
\"kubernetes.io/projected/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-kube-api-access-qhqvh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: \"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.245441 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.245279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: \"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.346217 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.346176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: \"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.346380 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.346247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqvh\" (UniqueName: \"kubernetes.io/projected/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-kube-api-access-qhqvh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: \"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.348477 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.348454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: 
\"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.355006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.354975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqvh\" (UniqueName: \"kubernetes.io/projected/57f78488-4e17-4a0f-a738-42d5b0dfa6f0-kube-api-access-qhqvh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7\" (UID: \"57f78488-4e17-4a0f-a738-42d5b0dfa6f0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.489889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.489858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:31.617109 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:31.617074 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7"] Apr 24 21:25:31.620481 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:25:31.620456 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f78488_4e17_4a0f_a738_42d5b0dfa6f0.slice/crio-9c2c29755853cff04925858c62b57a93ea8b754aef181927877089aad6fda0d0 WatchSource:0}: Error finding container 9c2c29755853cff04925858c62b57a93ea8b754aef181927877089aad6fda0d0: Status 404 returned error can't find the container with id 9c2c29755853cff04925858c62b57a93ea8b754aef181927877089aad6fda0d0 Apr 24 21:25:32.417889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:32.417847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" event={"ID":"57f78488-4e17-4a0f-a738-42d5b0dfa6f0","Type":"ContainerStarted","Data":"9c2c29755853cff04925858c62b57a93ea8b754aef181927877089aad6fda0d0"} Apr 24 21:25:35.290795 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:25:35.290762 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l8kq8"]
Apr 24 21:25:35.293909 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.293890 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:35.298126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.298108 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 21:25:35.298126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.298112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:25:35.298292 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.298145 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gl4bn\""
Apr 24 21:25:35.303038 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.303017 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l8kq8"]
Apr 24 21:25:35.381694 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.381640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2e0ef208-e8d1-4989-a6b4-23b811b0b865-cabundle0\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:35.381875 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.381721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72c5m\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-kube-api-access-72c5m\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") "
pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:35.381875 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.381780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:35.427841 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.427800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" event={"ID":"57f78488-4e17-4a0f-a738-42d5b0dfa6f0","Type":"ContainerStarted","Data":"fd62bfdef8daf3c9526ac053b35ee96ec428bb5f6138d9ae975175159a6f8c05"} Apr 24 21:25:35.428031 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.428013 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:25:35.463006 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.462960 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" podStartSLOduration=1.284517182 podStartE2EDuration="4.462946263s" podCreationTimestamp="2026-04-24 21:25:31 +0000 UTC" firstStartedPulling="2026-04-24 21:25:31.622838923 +0000 UTC m=+549.474027404" lastFinishedPulling="2026-04-24 21:25:34.801268008 +0000 UTC m=+552.652456485" observedRunningTime="2026-04-24 21:25:35.455770438 +0000 UTC m=+553.306958936" watchObservedRunningTime="2026-04-24 21:25:35.462946263 +0000 UTC m=+553.314134760" Apr 24 21:25:35.482377 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.482344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/2e0ef208-e8d1-4989-a6b4-23b811b0b865-cabundle0\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:35.482513 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.482388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72c5m\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-kube-api-access-72c5m\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:35.482513 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.482452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:35.482622 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.482586 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:25:35.482622 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.482605 2578 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:25:35.482622 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.482616 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:25:35.482789 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.482631 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l8kq8: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:25:35.482789 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.482722 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates podName:2e0ef208-e8d1-4989-a6b4-23b811b0b865 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:35.982674234 +0000 UTC m=+553.833862726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates") pod "keda-operator-ffbb595cb-l8kq8" (UID: "2e0ef208-e8d1-4989-a6b4-23b811b0b865") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 21:25:35.483553 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.483535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2e0ef208-e8d1-4989-a6b4-23b811b0b865-cabundle0\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:35.499664 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.499638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72c5m\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-kube-api-access-72c5m\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:35.985921 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:35.985862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:35.986110 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.985990 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:25:35.986110 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.986011 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:25:35.986110 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.986020 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l8kq8: references non-existent secret key: ca.crt
Apr 24 21:25:35.986110 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:35.986071 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates podName:2e0ef208-e8d1-4989-a6b4-23b811b0b865 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:36.986056909 +0000 UTC m=+554.837245391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates") pod "keda-operator-ffbb595cb-l8kq8" (UID: "2e0ef208-e8d1-4989-a6b4-23b811b0b865") : references non-existent secret key: ca.crt
Apr 24 21:25:36.994985 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:36.994950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:36.995386 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:36.995086 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:25:36.995386 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:36.995107 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:25:36.995386 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:36.995116 2578 projected.go:194] Error
preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l8kq8: references non-existent secret key: ca.crt
Apr 24 21:25:36.995386 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:25:36.995168 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates podName:2e0ef208-e8d1-4989-a6b4-23b811b0b865 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:38.99515444 +0000 UTC m=+556.846342916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates") pod "keda-operator-ffbb595cb-l8kq8" (UID: "2e0ef208-e8d1-4989-a6b4-23b811b0b865") : references non-existent secret key: ca.crt
Apr 24 21:25:39.013742 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:39.013689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:39.016291 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:39.016266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e0ef208-e8d1-4989-a6b4-23b811b0b865-certificates\") pod \"keda-operator-ffbb595cb-l8kq8\" (UID: \"2e0ef208-e8d1-4989-a6b4-23b811b0b865\") " pod="openshift-keda/keda-operator-ffbb595cb-l8kq8"
Apr 24 21:25:39.204777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:39.204737 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:39.328846 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:39.328811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l8kq8"] Apr 24 21:25:39.331819 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:25:39.331788 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0ef208_e8d1_4989_a6b4_23b811b0b865.slice/crio-98f6e1f8fa60527f2abe0c2e8ff9d2eae14be6fe8991c07b62d5e4b5ffd06664 WatchSource:0}: Error finding container 98f6e1f8fa60527f2abe0c2e8ff9d2eae14be6fe8991c07b62d5e4b5ffd06664: Status 404 returned error can't find the container with id 98f6e1f8fa60527f2abe0c2e8ff9d2eae14be6fe8991c07b62d5e4b5ffd06664 Apr 24 21:25:39.439703 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:39.439640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" event={"ID":"2e0ef208-e8d1-4989-a6b4-23b811b0b865","Type":"ContainerStarted","Data":"98f6e1f8fa60527f2abe0c2e8ff9d2eae14be6fe8991c07b62d5e4b5ffd06664"} Apr 24 21:25:42.451184 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:42.451141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" event={"ID":"2e0ef208-e8d1-4989-a6b4-23b811b0b865","Type":"ContainerStarted","Data":"43ef86da2eff72feb2ba62f5a40bdb301543751b32841b318c6c16c9e73b9c98"} Apr 24 21:25:42.451544 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:42.451311 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:25:42.469310 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:42.469260 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" podStartSLOduration=4.510713893 podStartE2EDuration="7.469246901s" 
podCreationTimestamp="2026-04-24 21:25:35 +0000 UTC" firstStartedPulling="2026-04-24 21:25:39.333566298 +0000 UTC m=+557.184754793" lastFinishedPulling="2026-04-24 21:25:42.29209932 +0000 UTC m=+560.143287801" observedRunningTime="2026-04-24 21:25:42.468258846 +0000 UTC m=+560.319447336" watchObservedRunningTime="2026-04-24 21:25:42.469246901 +0000 UTC m=+560.320435400" Apr 24 21:25:56.434181 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:25:56.434145 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xv6q7" Apr 24 21:26:03.457205 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:03.457175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-l8kq8" Apr 24 21:26:22.653767 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:22.653735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:26:22.654255 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:22.654207 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:26:43.374389 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.374297 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2"] Apr 24 21:26:43.377608 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.377582 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:43.380384 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.380356 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:26:43.381098 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.381079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-6w7pm\"" Apr 24 21:26:43.381228 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.381081 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:26:43.381896 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.381874 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:26:43.394152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.394121 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2"] Apr 24 21:26:43.407591 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.407550 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dnd4k"] Apr 24 21:26:43.411195 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.411170 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.413965 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.413941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9gzd4\"" Apr 24 21:26:43.414110 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.414008 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:26:43.422409 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.422378 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dnd4k"] Apr 24 21:26:43.480461 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.480416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgp5\" (UniqueName: \"kubernetes.io/projected/31a0d55e-bdad-4e40-904b-9a677606a391-kube-api-access-wfgp5\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:43.480461 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.480465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f01308fc-a4d6-482c-abcd-2c45cba9f156-data\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.480730 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.480498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctj7\" (UniqueName: \"kubernetes.io/projected/f01308fc-a4d6-482c-abcd-2c45cba9f156-kube-api-access-rctj7\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.480730 ip-10-0-132-81 kubenswrapper[2578]: 
I0424 21:26:43.480551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2"
Apr 24 21:26:43.581061 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.581022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgp5\" (UniqueName: \"kubernetes.io/projected/31a0d55e-bdad-4e40-904b-9a677606a391-kube-api-access-wfgp5\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2"
Apr 24 21:26:43.581254 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.581069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f01308fc-a4d6-482c-abcd-2c45cba9f156-data\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k"
Apr 24 21:26:43.581254 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.581106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rctj7\" (UniqueName: \"kubernetes.io/projected/f01308fc-a4d6-482c-abcd-2c45cba9f156-kube-api-access-rctj7\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k"
Apr 24 21:26:43.581254 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.581139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") "
pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:43.581254 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:26:43.581249 2578 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 24 21:26:43.581443 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:26:43.581318 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert podName:31a0d55e-bdad-4e40-904b-9a677606a391 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:44.081295505 +0000 UTC m=+621.932483982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert") pod "llmisvc-controller-manager-68cc5db7c4-gzdl2" (UID: "31a0d55e-bdad-4e40-904b-9a677606a391") : secret "llmisvc-webhook-server-cert" not found Apr 24 21:26:43.581513 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.581489 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f01308fc-a4d6-482c-abcd-2c45cba9f156-data\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.592232 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.592179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctj7\" (UniqueName: \"kubernetes.io/projected/f01308fc-a4d6-482c-abcd-2c45cba9f156-kube-api-access-rctj7\") pod \"seaweedfs-86cc847c5c-dnd4k\" (UID: \"f01308fc-a4d6-482c-abcd-2c45cba9f156\") " pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.593665 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.593640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgp5\" (UniqueName: \"kubernetes.io/projected/31a0d55e-bdad-4e40-904b-9a677606a391-kube-api-access-wfgp5\") pod 
\"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:43.722885 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.722791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:43.853668 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:43.853485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dnd4k"] Apr 24 21:26:43.856315 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:26:43.856275 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01308fc_a4d6_482c_abcd_2c45cba9f156.slice/crio-13777b53c6aaa814a89c2c51980e9df07690d58a312fca570b7450c548aaa7e4 WatchSource:0}: Error finding container 13777b53c6aaa814a89c2c51980e9df07690d58a312fca570b7450c548aaa7e4: Status 404 returned error can't find the container with id 13777b53c6aaa814a89c2c51980e9df07690d58a312fca570b7450c548aaa7e4 Apr 24 21:26:44.085284 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.085250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:44.088076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.088053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a0d55e-bdad-4e40-904b-9a677606a391-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gzdl2\" (UID: \"31a0d55e-bdad-4e40-904b-9a677606a391\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:44.290714 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.290659 2578 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:44.496755 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.496730 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2"] Apr 24 21:26:44.499748 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:26:44.499714 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod31a0d55e_bdad_4e40_904b_9a677606a391.slice/crio-4ad1d45c9ef9c8ed9e859dd2fb5d4b23b85521594fe208bb2cce6aa6f473b63f WatchSource:0}: Error finding container 4ad1d45c9ef9c8ed9e859dd2fb5d4b23b85521594fe208bb2cce6aa6f473b63f: Status 404 returned error can't find the container with id 4ad1d45c9ef9c8ed9e859dd2fb5d4b23b85521594fe208bb2cce6aa6f473b63f Apr 24 21:26:44.639665 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.639565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dnd4k" event={"ID":"f01308fc-a4d6-482c-abcd-2c45cba9f156","Type":"ContainerStarted","Data":"13777b53c6aaa814a89c2c51980e9df07690d58a312fca570b7450c548aaa7e4"} Apr 24 21:26:44.640747 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:44.640718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" event={"ID":"31a0d55e-bdad-4e40-904b-9a677606a391","Type":"ContainerStarted","Data":"4ad1d45c9ef9c8ed9e859dd2fb5d4b23b85521594fe208bb2cce6aa6f473b63f"} Apr 24 21:26:47.653251 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.653212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" event={"ID":"31a0d55e-bdad-4e40-904b-9a677606a391","Type":"ContainerStarted","Data":"7860b7a7d0ea4e690c30e4707ac4bf0e62fb670642c427ee9f488cc7809a393f"} Apr 24 21:26:47.653835 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.653323 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:26:47.654472 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.654451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dnd4k" event={"ID":"f01308fc-a4d6-482c-abcd-2c45cba9f156","Type":"ContainerStarted","Data":"712f55bca7f91fa501564af9d40d9e50f6b128a1e59918a68cda6015282ac867"} Apr 24 21:26:47.654594 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.654580 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:26:47.669009 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.668953 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" podStartSLOduration=1.615380673 podStartE2EDuration="4.668935795s" podCreationTimestamp="2026-04-24 21:26:43 +0000 UTC" firstStartedPulling="2026-04-24 21:26:44.5013248 +0000 UTC m=+622.352513281" lastFinishedPulling="2026-04-24 21:26:47.554879926 +0000 UTC m=+625.406068403" observedRunningTime="2026-04-24 21:26:47.668088085 +0000 UTC m=+625.519276585" watchObservedRunningTime="2026-04-24 21:26:47.668935795 +0000 UTC m=+625.520124296" Apr 24 21:26:47.683835 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:47.683783 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dnd4k" podStartSLOduration=1.042576336 podStartE2EDuration="4.683766411s" podCreationTimestamp="2026-04-24 21:26:43 +0000 UTC" firstStartedPulling="2026-04-24 21:26:43.857720779 +0000 UTC m=+621.708909256" lastFinishedPulling="2026-04-24 21:26:47.498910854 +0000 UTC m=+625.350099331" observedRunningTime="2026-04-24 21:26:47.682236367 +0000 UTC m=+625.533424866" watchObservedRunningTime="2026-04-24 21:26:47.683766411 +0000 UTC m=+625.534954909" Apr 24 21:26:53.664503 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:26:53.664471 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dnd4k" Apr 24 21:27:18.664998 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:18.664964 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gzdl2" Apr 24 21:27:57.634262 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.634223 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57d8cfbd8d-qxglm"] Apr 24 21:27:57.637503 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.637485 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.647486 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.647458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57d8cfbd8d-qxglm"] Apr 24 21:27:57.780581 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.780581 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-service-ca\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.780892 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-trusted-ca-bundle\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.780892 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-console-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.780892 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-oauth-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.781030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d2gr\" (UniqueName: \"kubernetes.io/projected/fb46568a-d191-418a-8d51-03b26e627c83-kube-api-access-8d2gr\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.781030 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.780929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-oauth-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882326 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:27:57.882281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-trusted-ca-bundle\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882326 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-console-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-oauth-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d2gr\" (UniqueName: \"kubernetes.io/projected/fb46568a-d191-418a-8d51-03b26e627c83-kube-api-access-8d2gr\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-oauth-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " 
pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882422 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.882585 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.882444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-service-ca\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.883178 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.883150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-oauth-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.883296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.883174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-console-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.883296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.883209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-service-ca\") pod \"console-57d8cfbd8d-qxglm\" (UID: 
\"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.883511 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.883493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb46568a-d191-418a-8d51-03b26e627c83-trusted-ca-bundle\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.885066 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.885003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-serving-cert\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.885164 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.885067 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb46568a-d191-418a-8d51-03b26e627c83-console-oauth-config\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.893659 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.893631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d2gr\" (UniqueName: \"kubernetes.io/projected/fb46568a-d191-418a-8d51-03b26e627c83-kube-api-access-8d2gr\") pod \"console-57d8cfbd8d-qxglm\" (UID: \"fb46568a-d191-418a-8d51-03b26e627c83\") " pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:57.947152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:57.947121 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:27:58.077703 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:58.077655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57d8cfbd8d-qxglm"] Apr 24 21:27:58.080310 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:27:58.080281 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb46568a_d191_418a_8d51_03b26e627c83.slice/crio-62b66d36c65a47794196f6bb87dddf8ea06cbbf4fdfd92b2abda8a3cc4ae64bb WatchSource:0}: Error finding container 62b66d36c65a47794196f6bb87dddf8ea06cbbf4fdfd92b2abda8a3cc4ae64bb: Status 404 returned error can't find the container with id 62b66d36c65a47794196f6bb87dddf8ea06cbbf4fdfd92b2abda8a3cc4ae64bb Apr 24 21:27:58.872625 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:58.872585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57d8cfbd8d-qxglm" event={"ID":"fb46568a-d191-418a-8d51-03b26e627c83","Type":"ContainerStarted","Data":"146d03f81e165725471434c7203d351c3b5feddaf83f8816e28a4001caa1aa32"} Apr 24 21:27:58.872625 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:58.872622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57d8cfbd8d-qxglm" event={"ID":"fb46568a-d191-418a-8d51-03b26e627c83","Type":"ContainerStarted","Data":"62b66d36c65a47794196f6bb87dddf8ea06cbbf4fdfd92b2abda8a3cc4ae64bb"} Apr 24 21:27:58.891608 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:27:58.891557 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57d8cfbd8d-qxglm" podStartSLOduration=1.89154095 podStartE2EDuration="1.89154095s" podCreationTimestamp="2026-04-24 21:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:58.890642611 +0000 UTC m=+696.741831110" 
watchObservedRunningTime="2026-04-24 21:27:58.89154095 +0000 UTC m=+696.742729475" Apr 24 21:28:07.948197 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:07.948150 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:28:07.948197 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:07.948197 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:28:07.953314 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:07.953288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:28:08.100636 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.100594 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-x9gcs"] Apr 24 21:28:08.104084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.104058 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x9gcs" Apr 24 21:28:08.110589 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.110565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x9gcs"] Apr 24 21:28:08.262389 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.262358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7sz\" (UniqueName: \"kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz\") pod \"s3-init-x9gcs\" (UID: \"3e769146-edb3-4f7e-912a-8ecc88bd044d\") " pod="kserve/s3-init-x9gcs" Apr 24 21:28:08.363271 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.363228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7sz\" (UniqueName: \"kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz\") pod \"s3-init-x9gcs\" (UID: \"3e769146-edb3-4f7e-912a-8ecc88bd044d\") " 
pod="kserve/s3-init-x9gcs" Apr 24 21:28:08.371591 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.371552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7sz\" (UniqueName: \"kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz\") pod \"s3-init-x9gcs\" (UID: \"3e769146-edb3-4f7e-912a-8ecc88bd044d\") " pod="kserve/s3-init-x9gcs" Apr 24 21:28:08.414494 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.414454 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x9gcs" Apr 24 21:28:08.531043 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.530969 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x9gcs"] Apr 24 21:28:08.534211 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:28:08.534183 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e769146_edb3_4f7e_912a_8ecc88bd044d.slice/crio-078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f WatchSource:0}: Error finding container 078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f: Status 404 returned error can't find the container with id 078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f Apr 24 21:28:08.906981 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.906893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x9gcs" event={"ID":"3e769146-edb3-4f7e-912a-8ecc88bd044d","Type":"ContainerStarted","Data":"078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f"} Apr 24 21:28:08.911985 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.911956 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57d8cfbd8d-qxglm" Apr 24 21:28:08.957264 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:08.956475 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:28:13.927063 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:13.927025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x9gcs" event={"ID":"3e769146-edb3-4f7e-912a-8ecc88bd044d","Type":"ContainerStarted","Data":"7db1bcaa41b60231d0e93da54865f1b2a1707bab36732c733f46180cd74adf59"} Apr 24 21:28:13.943387 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:13.943336 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-x9gcs" podStartSLOduration=1.394154717 podStartE2EDuration="5.943320174s" podCreationTimestamp="2026-04-24 21:28:08 +0000 UTC" firstStartedPulling="2026-04-24 21:28:08.536056861 +0000 UTC m=+706.387245342" lastFinishedPulling="2026-04-24 21:28:13.085222319 +0000 UTC m=+710.936410799" observedRunningTime="2026-04-24 21:28:13.941789789 +0000 UTC m=+711.792978288" watchObservedRunningTime="2026-04-24 21:28:13.943320174 +0000 UTC m=+711.794508672" Apr 24 21:28:16.937764 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:16.937723 2578 generic.go:358] "Generic (PLEG): container finished" podID="3e769146-edb3-4f7e-912a-8ecc88bd044d" containerID="7db1bcaa41b60231d0e93da54865f1b2a1707bab36732c733f46180cd74adf59" exitCode=0 Apr 24 21:28:16.938167 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:16.937794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x9gcs" event={"ID":"3e769146-edb3-4f7e-912a-8ecc88bd044d","Type":"ContainerDied","Data":"7db1bcaa41b60231d0e93da54865f1b2a1707bab36732c733f46180cd74adf59"} Apr 24 21:28:18.060294 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.060267 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-x9gcs" Apr 24 21:28:18.142664 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.142629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7sz\" (UniqueName: \"kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz\") pod \"3e769146-edb3-4f7e-912a-8ecc88bd044d\" (UID: \"3e769146-edb3-4f7e-912a-8ecc88bd044d\") " Apr 24 21:28:18.144931 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.144890 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz" (OuterVolumeSpecName: "kube-api-access-vl7sz") pod "3e769146-edb3-4f7e-912a-8ecc88bd044d" (UID: "3e769146-edb3-4f7e-912a-8ecc88bd044d"). InnerVolumeSpecName "kube-api-access-vl7sz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:18.244098 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.244064 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vl7sz\" (UniqueName: \"kubernetes.io/projected/3e769146-edb3-4f7e-912a-8ecc88bd044d-kube-api-access-vl7sz\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:18.946442 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.946413 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-x9gcs" Apr 24 21:28:18.946607 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.946411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x9gcs" event={"ID":"3e769146-edb3-4f7e-912a-8ecc88bd044d","Type":"ContainerDied","Data":"078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f"} Apr 24 21:28:18.946607 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:18.946521 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="078f8ae1c40dbbd23544bfb2a18dbf6c3917051fe086a14b1a690507d128417f" Apr 24 21:28:29.025992 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.025958 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"] Apr 24 21:28:29.026442 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.026299 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e769146-edb3-4f7e-912a-8ecc88bd044d" containerName="s3-init" Apr 24 21:28:29.026442 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.026312 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e769146-edb3-4f7e-912a-8ecc88bd044d" containerName="s3-init" Apr 24 21:28:29.026442 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.026372 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e769146-edb3-4f7e-912a-8ecc88bd044d" containerName="s3-init" Apr 24 21:28:29.030307 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.030289 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.032567 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.032542 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 24 21:28:29.032719 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.032666 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xkfdn\"" Apr 24 21:28:29.032785 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.032734 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:28:29.032836 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.032738 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:28:29.033180 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.033157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8xm\" (UniqueName: \"kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.033278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.033200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 
24 21:28:29.033278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.033229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.033278 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.033250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.033562 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.033546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 24 21:28:29.040822 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.040801 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"] Apr 24 21:28:29.042833 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.042795 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"] Apr 24 21:28:29.046144 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.046117 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.048242 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.048222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d41b1-predictor-serving-cert\"" Apr 24 21:28:29.048335 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.048226 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\"" Apr 24 21:28:29.056525 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.056499 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"] Apr 24 21:28:29.133912 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.133874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.133917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.133939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.133972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.134016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-965t8\" (UniqueName: \"kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.134049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:28:29.134058 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-serving-cert: secret "isvc-sklearn-graph-1-predictor-serving-cert" not 
found Apr 24 21:28:29.134111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.134091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8xm\" (UniqueName: \"kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.134463 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:28:29.134145 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls podName:412b7f5f-3972-48a5-a5b2-ef53db782271 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.63412355 +0000 UTC m=+727.485312032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls") pod "isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" (UID: "412b7f5f-3972-48a5-a5b2-ef53db782271") : secret "isvc-sklearn-graph-1-predictor-serving-cert" not found Apr 24 21:28:29.134572 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.134554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.134690 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.134655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.145145 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.145120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8xm\" (UniqueName: \"kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.235594 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.235557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-965t8\" (UniqueName: \"kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.235785 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.235613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.235785 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.235750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " 
pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.235785 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:28:29.235778 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-serving-cert: secret "error-404-isvc-d41b1-predictor-serving-cert" not found Apr 24 21:28:29.235947 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:28:29.235869 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls podName:404559c5-ccb0-43a1-80ce-1c7609bc5692 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.735850176 +0000 UTC m=+727.587038653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls") pod "error-404-isvc-d41b1-predictor-597b847c78-mknpz" (UID: "404559c5-ccb0-43a1-80ce-1c7609bc5692") : secret "error-404-isvc-d41b1-predictor-serving-cert" not found Apr 24 21:28:29.236493 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.236470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.244660 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.244639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-965t8\" (UniqueName: \"kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 
21:28:29.639059 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.639018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.641415 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.641385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-66d58bff49-kwvch\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.739726 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.739671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.742187 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.742155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") pod \"error-404-isvc-d41b1-predictor-597b847c78-mknpz\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") " pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.941238 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.941138 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:29.961147 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.961109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:29.994181 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.993579 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"] Apr 24 21:28:29.999036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:29.999013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.002027 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.001799 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 24 21:28:30.002344 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.002308 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 24 21:28:30.006195 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.006133 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"] Apr 24 21:28:30.042336 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.042289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.042336 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:28:30.042329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbht\" (UniqueName: \"kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.042840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.042365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.042840 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.042388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.078439 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.078266 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"] Apr 24 21:28:30.081345 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:28:30.081317 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412b7f5f_3972_48a5_a5b2_ef53db782271.slice/crio-26cc82a4a8b924bbb08e458be7018d3380aa2c16ace169bbfcf6d4e6f4e6970c WatchSource:0}: Error finding container 
26cc82a4a8b924bbb08e458be7018d3380aa2c16ace169bbfcf6d4e6f4e6970c: Status 404 returned error can't find the container with id 26cc82a4a8b924bbb08e458be7018d3380aa2c16ace169bbfcf6d4e6f4e6970c Apr 24 21:28:30.099883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.099859 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"] Apr 24 21:28:30.103081 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:28:30.103056 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404559c5_ccb0_43a1_80ce_1c7609bc5692.slice/crio-dea944230a26b0db87b17cc6e44a133a9488e950cbb2c306929103be7b60afdb WatchSource:0}: Error finding container dea944230a26b0db87b17cc6e44a133a9488e950cbb2c306929103be7b60afdb: Status 404 returned error can't find the container with id dea944230a26b0db87b17cc6e44a133a9488e950cbb2c306929103be7b60afdb Apr 24 21:28:30.143209 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.143171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.143404 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.143215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.143404 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.143252 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.143404 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.143279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbht\" (UniqueName: \"kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.143781 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.143757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.144062 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.144039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.145610 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.145580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.151768 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.151741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbht\" (UniqueName: \"kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht\") pod \"isvc-sklearn-graph-2-predictor-87d6c5875-x4cml\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.320014 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.319967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:30.444003 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:30.443979 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"] Apr 24 21:28:30.446179 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:28:30.446152 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacf56c6_f309_4e5d_8667_c2dd2f6fe3f1.slice/crio-4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956 WatchSource:0}: Error finding container 4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956: Status 404 returned error can't find the container with id 4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956 Apr 24 21:28:31.008084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:31.008037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" 
event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerStarted","Data":"4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956"} Apr 24 21:28:31.016555 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:31.016515 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerStarted","Data":"dea944230a26b0db87b17cc6e44a133a9488e950cbb2c306929103be7b60afdb"} Apr 24 21:28:31.027651 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:31.027589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerStarted","Data":"26cc82a4a8b924bbb08e458be7018d3380aa2c16ace169bbfcf6d4e6f4e6970c"} Apr 24 21:28:33.989935 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:33.989887 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67ddfbdbd5-g6vdq" podUID="035ee76b-7011-4224-94c4-cbcf1b848a0b" containerName="console" containerID="cri-o://e7982edd1825d958b12b5141acac4d2e597b2fe949eac86ff43d3232cb95bc14" gracePeriod=15 Apr 24 21:28:35.051109 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.050856 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67ddfbdbd5-g6vdq_035ee76b-7011-4224-94c4-cbcf1b848a0b/console/0.log" Apr 24 21:28:35.051109 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.050906 2578 generic.go:358] "Generic (PLEG): container finished" podID="035ee76b-7011-4224-94c4-cbcf1b848a0b" containerID="e7982edd1825d958b12b5141acac4d2e597b2fe949eac86ff43d3232cb95bc14" exitCode=2 Apr 24 21:28:35.051109 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.051069 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ddfbdbd5-g6vdq" 
event={"ID":"035ee76b-7011-4224-94c4-cbcf1b848a0b","Type":"ContainerDied","Data":"e7982edd1825d958b12b5141acac4d2e597b2fe949eac86ff43d3232cb95bc14"} Apr 24 21:28:35.835094 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.835069 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67ddfbdbd5-g6vdq_035ee76b-7011-4224-94c4-cbcf1b848a0b/console/0.log" Apr 24 21:28:35.835231 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.835131 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:28:35.899539 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899502 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899557 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899590 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6z7q\" (UniqueName: \"kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899623 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899723 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899966 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899770 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.899966 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.899808 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config\") pod \"035ee76b-7011-4224-94c4-cbcf1b848a0b\" (UID: \"035ee76b-7011-4224-94c4-cbcf1b848a0b\") " Apr 24 21:28:35.900445 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.900276 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:35.900445 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.900298 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:35.900445 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.900349 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config" (OuterVolumeSpecName: "console-config") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:35.900711 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.900470 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca" (OuterVolumeSpecName: "service-ca") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:35.902945 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.902899 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:35.902945 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.902934 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q" (OuterVolumeSpecName: "kube-api-access-t6z7q") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "kube-api-access-t6z7q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:35.903296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:35.903257 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "035ee76b-7011-4224-94c4-cbcf1b848a0b" (UID: "035ee76b-7011-4224-94c4-cbcf1b848a0b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:36.000662 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000623 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000662 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000663 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-trusted-ca-bundle\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000917 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000695 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-oauth-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000917 ip-10-0-132-81 
kubenswrapper[2578]: I0424 21:28:36.000709 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6z7q\" (UniqueName: \"kubernetes.io/projected/035ee76b-7011-4224-94c4-cbcf1b848a0b-kube-api-access-t6z7q\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000917 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000724 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035ee76b-7011-4224-94c4-cbcf1b848a0b-console-serving-cert\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000917 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000737 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-oauth-serving-cert\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.000917 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.000751 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035ee76b-7011-4224-94c4-cbcf1b848a0b-service-ca\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:28:36.059752 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.059723 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67ddfbdbd5-g6vdq_035ee76b-7011-4224-94c4-cbcf1b848a0b/console/0.log" Apr 24 21:28:36.060171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.059840 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ddfbdbd5-g6vdq" event={"ID":"035ee76b-7011-4224-94c4-cbcf1b848a0b","Type":"ContainerDied","Data":"4c6d4f5436235076853257f3399f972e1bcf595bd8221cdf5dfcd8662a4663af"} Apr 24 21:28:36.060171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.059891 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67ddfbdbd5-g6vdq" Apr 24 21:28:36.060171 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.059906 2578 scope.go:117] "RemoveContainer" containerID="e7982edd1825d958b12b5141acac4d2e597b2fe949eac86ff43d3232cb95bc14" Apr 24 21:28:36.086814 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.086778 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:28:36.091130 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.091098 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67ddfbdbd5-g6vdq"] Apr 24 21:28:36.713387 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:36.713352 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035ee76b-7011-4224-94c4-cbcf1b848a0b" path="/var/lib/kubelet/pods/035ee76b-7011-4224-94c4-cbcf1b848a0b/volumes" Apr 24 21:28:45.101264 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:45.100664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerStarted","Data":"6d742a3e67304dc0178739221f7302eca0124612503bf706ee38aa95fff2dcf6"} Apr 24 21:28:45.103036 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:45.102999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerStarted","Data":"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"} Apr 24 21:28:45.105286 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:45.105259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerStarted","Data":"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5"} Apr 24 
21:28:47.116100 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:47.116059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerStarted","Data":"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"} Apr 24 21:28:47.116577 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:47.116307 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:47.116577 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:47.116437 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:47.117606 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:47.117563 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:28:47.136661 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:47.136592 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podStartSLOduration=1.253981595 podStartE2EDuration="18.136579041s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.104700826 +0000 UTC m=+727.955889303" lastFinishedPulling="2026-04-24 21:28:46.987298258 +0000 UTC m=+744.838486749" observedRunningTime="2026-04-24 21:28:47.134660234 +0000 UTC m=+744.985848733" watchObservedRunningTime="2026-04-24 21:28:47.136579041 +0000 UTC m=+744.987767539" Apr 24 21:28:48.120275 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:48.120239 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:28:49.124276 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:49.124240 2578 generic.go:358] "Generic (PLEG): container finished" podID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerID="b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5" exitCode=0 Apr 24 21:28:49.124750 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:49.124314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerDied","Data":"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5"} Apr 24 21:28:49.125811 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:49.125746 2578 generic.go:358] "Generic (PLEG): container finished" podID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerID="6d742a3e67304dc0178739221f7302eca0124612503bf706ee38aa95fff2dcf6" exitCode=0 Apr 24 21:28:49.125811 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:49.125779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerDied","Data":"6d742a3e67304dc0178739221f7302eca0124612503bf706ee38aa95fff2dcf6"} Apr 24 21:28:53.126616 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:53.126583 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:28:53.127251 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:53.127052 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" 
podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:28:56.151878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.151839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerStarted","Data":"e39e1f835a5b44eb8e696ce01ccde70361772827bf37c468dc4e485c4e773f55"} Apr 24 21:28:56.151878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.151883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerStarted","Data":"ad0a3961d2c6c726386537932de9ab91a9da3b8bd915d769a9accf3ea6fbada9"} Apr 24 21:28:56.152401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.152190 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:56.152401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.152316 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:28:56.153845 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.153815 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:28:56.153980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.153843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" 
event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerStarted","Data":"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577"} Apr 24 21:28:56.153980 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.153867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerStarted","Data":"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd"} Apr 24 21:28:56.154092 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.154055 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:56.154092 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.154076 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:28:56.155045 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.155024 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:28:56.173231 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.173189 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podStartSLOduration=2.101088042 podStartE2EDuration="27.173177589s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.448522392 +0000 UTC m=+728.299710870" lastFinishedPulling="2026-04-24 21:28:55.520611927 +0000 UTC m=+753.371800417" observedRunningTime="2026-04-24 21:28:56.172207099 +0000 UTC m=+754.023395576" watchObservedRunningTime="2026-04-24 21:28:56.173177589 +0000 
UTC m=+754.024366087" Apr 24 21:28:56.192100 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:56.192054 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podStartSLOduration=1.734293424 podStartE2EDuration="27.19204101s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.083482889 +0000 UTC m=+727.934671367" lastFinishedPulling="2026-04-24 21:28:55.54123046 +0000 UTC m=+753.392418953" observedRunningTime="2026-04-24 21:28:56.190535083 +0000 UTC m=+754.041723592" watchObservedRunningTime="2026-04-24 21:28:56.19204101 +0000 UTC m=+754.043229526" Apr 24 21:28:57.157436 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:57.157389 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:28:57.157957 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:28:57.157587 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:02.161339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:02.161309 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:29:02.161812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:02.161375 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:29:02.161914 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:02.161889 2578 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:29:02.162047 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:02.162021 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:03.127416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:03.127371 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:29:12.162626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:12.162579 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:12.163077 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:12.162592 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:29:13.127290 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:13.127242 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" 
podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:29:22.161936 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:22.161890 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:22.162426 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:22.161889 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:29:23.127248 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:23.127203 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:29:32.162606 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:32.162560 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:32.162606 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:32.162584 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.26:8080: connect: connection refused" Apr 24 21:29:33.127878 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:33.127841 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" Apr 24 21:29:42.161979 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:42.161869 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:29:42.161979 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:42.161934 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:52.161973 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:52.161922 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:29:52.162450 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:29:52.161981 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:30:02.162823 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:02.162791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:30:02.163408 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:02.162861 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:30:03.056933 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.056899 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"] Apr 24 21:30:03.057280 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.057253 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" containerID="cri-o://19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0" gracePeriod=30 Apr 24 21:30:03.057401 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.057324 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kube-rbac-proxy" containerID="cri-o://172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af" gracePeriod=30 Apr 24 21:30:03.121314 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.121274 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 21:30:03.127552 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.127503 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" 
podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:30:03.376976 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.376888 2578 generic.go:358] "Generic (PLEG): container finished" podID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerID="172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af" exitCode=2 Apr 24 21:30:03.376976 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.376936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerDied","Data":"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"} Apr 24 21:30:03.633793 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.633704 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:30:03.634274 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.634257 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="035ee76b-7011-4224-94c4-cbcf1b848a0b" containerName="console" Apr 24 21:30:03.634320 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.634280 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ee76b-7011-4224-94c4-cbcf1b848a0b" containerName="console" Apr 24 21:30:03.634372 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.634360 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="035ee76b-7011-4224-94c4-cbcf1b848a0b" containerName="console" Apr 24 21:30:03.637751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.637733 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.640157 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.640129 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b868d-predictor-serving-cert\"" Apr 24 21:30:03.640286 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.640229 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b868d-kube-rbac-proxy-sar-config\"" Apr 24 21:30:03.647945 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.647915 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:30:03.716749 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.716713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.716749 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.716749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqgdb\" (UniqueName: \"kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.716956 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.716776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.817253 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.817211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.817455 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:03.817320 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-b868d-predictor-serving-cert: secret "error-404-isvc-b868d-predictor-serving-cert" not found Apr 24 21:30:03.817455 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.817344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqgdb\" (UniqueName: \"kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.817455 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:03.817383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls podName:319d9352-b194-4a33-83c2-3906eaa110fd nodeName:}" failed. No retries permitted until 2026-04-24 21:30:04.317363415 +0000 UTC m=+822.168551905 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls") pod "error-404-isvc-b868d-predictor-5b5b76b898-87zsk" (UID: "319d9352-b194-4a33-83c2-3906eaa110fd") : secret "error-404-isvc-b868d-predictor-serving-cert" not found Apr 24 21:30:03.817455 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.817422 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.818081 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.818056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:03.825819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:03.825792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqgdb\" (UniqueName: \"kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:04.321081 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:04.321042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:04.323594 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:04.323565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") pod \"error-404-isvc-b868d-predictor-5b5b76b898-87zsk\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:04.548956 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:04.548918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:04.671469 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:04.671445 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:30:04.673809 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:30:04.673782 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319d9352_b194_4a33_83c2_3906eaa110fd.slice/crio-22ca133271eb0e75ee74bb8b19600e23302db103b542175d2e7120d48baea1e3 WatchSource:0}: Error finding container 22ca133271eb0e75ee74bb8b19600e23302db103b542175d2e7120d48baea1e3: Status 404 returned error can't find the container with id 22ca133271eb0e75ee74bb8b19600e23302db103b542175d2e7120d48baea1e3 Apr 24 21:30:04.675574 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:04.675554 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:30:05.385626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.385589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerStarted","Data":"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd"} Apr 24 21:30:05.385626 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.385631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerStarted","Data":"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5"} Apr 24 21:30:05.385891 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.385646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerStarted","Data":"22ca133271eb0e75ee74bb8b19600e23302db103b542175d2e7120d48baea1e3"} Apr 24 21:30:05.385891 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.385796 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:05.385986 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.385927 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:05.387233 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.387208 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 21:30:05.403438 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:05.403382 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podStartSLOduration=2.403368964 podStartE2EDuration="2.403368964s" podCreationTimestamp="2026-04-24 21:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:05.400878276 +0000 UTC m=+823.252066775" watchObservedRunningTime="2026-04-24 21:30:05.403368964 +0000 UTC m=+823.254557462"
Apr 24 21:30:06.310596 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.305980 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"
Apr 24 21:30:06.391512 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.391424 2578 generic.go:358] "Generic (PLEG): container finished" podID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerID="19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0" exitCode=0
Apr 24 21:30:06.391647 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.391512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerDied","Data":"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"}
Apr 24 21:30:06.391647 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.391555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz" event={"ID":"404559c5-ccb0-43a1-80ce-1c7609bc5692","Type":"ContainerDied","Data":"dea944230a26b0db87b17cc6e44a133a9488e950cbb2c306929103be7b60afdb"}
Apr 24 21:30:06.391647 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.391576 2578 scope.go:117] "RemoveContainer" containerID="172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"
Apr 24 21:30:06.391647 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.391519 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"
Apr 24 21:30:06.392056 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.392029 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 21:30:06.399960 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.399928 2578 scope.go:117] "RemoveContainer" containerID="19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"
Apr 24 21:30:06.407281 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.407250 2578 scope.go:117] "RemoveContainer" containerID="172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"
Apr 24 21:30:06.407547 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:06.407527 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af\": container with ID starting with 172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af not found: ID does not exist" containerID="172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"
Apr 24 21:30:06.407624 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.407559 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af"} err="failed to get container status \"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af\": rpc error: code = NotFound desc = could not find container \"172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af\": container with ID starting with 172295b1ced15e359fa4ff7358161dafbc43d99a93eb7723eea1c0ae48b5d1af not found: ID does not exist"
Apr 24 21:30:06.407624 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.407584 2578 scope.go:117] "RemoveContainer" containerID="19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"
Apr 24 21:30:06.407847 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:06.407830 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0\": container with ID starting with 19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0 not found: ID does not exist" containerID="19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"
Apr 24 21:30:06.407890 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.407854 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0"} err="failed to get container status \"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0\": rpc error: code = NotFound desc = could not find container \"19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0\": container with ID starting with 19fe970ea1f600352b1f3b71f8b52086f9a5ce327f7e2d267e016b71011ce9a0 not found: ID does not exist"
Apr 24 21:30:06.439154 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.439120 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-965t8\" (UniqueName: \"kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8\") pod \"404559c5-ccb0-43a1-80ce-1c7609bc5692\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") "
Apr 24 21:30:06.439319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.439163 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") pod \"404559c5-ccb0-43a1-80ce-1c7609bc5692\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") "
Apr 24 21:30:06.439319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.439200 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config\") pod \"404559c5-ccb0-43a1-80ce-1c7609bc5692\" (UID: \"404559c5-ccb0-43a1-80ce-1c7609bc5692\") "
Apr 24 21:30:06.439821 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.439788 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d41b1-kube-rbac-proxy-sar-config") pod "404559c5-ccb0-43a1-80ce-1c7609bc5692" (UID: "404559c5-ccb0-43a1-80ce-1c7609bc5692"). InnerVolumeSpecName "error-404-isvc-d41b1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:06.441397 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.441375 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "404559c5-ccb0-43a1-80ce-1c7609bc5692" (UID: "404559c5-ccb0-43a1-80ce-1c7609bc5692"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:30:06.441477 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.441434 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8" (OuterVolumeSpecName: "kube-api-access-965t8") pod "404559c5-ccb0-43a1-80ce-1c7609bc5692" (UID: "404559c5-ccb0-43a1-80ce-1c7609bc5692"). InnerVolumeSpecName "kube-api-access-965t8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:30:06.539877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.539844 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-965t8\" (UniqueName: \"kubernetes.io/projected/404559c5-ccb0-43a1-80ce-1c7609bc5692-kube-api-access-965t8\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:06.539877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.539874 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/404559c5-ccb0-43a1-80ce-1c7609bc5692-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:06.539877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.539886 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d41b1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/404559c5-ccb0-43a1-80ce-1c7609bc5692-error-404-isvc-d41b1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:06.713347 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.713280 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"]
Apr 24 21:30:06.718357 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:06.718325 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz"]
Apr 24 21:30:08.712216 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:08.712177 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" path="/var/lib/kubelet/pods/404559c5-ccb0-43a1-80ce-1c7609bc5692/volumes"
Apr 24 21:30:11.397243 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:11.397213 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"
Apr 24 21:30:11.397731 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:11.397645 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 21:30:21.398233 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:21.398176 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 21:30:31.398482 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:31.398428 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 21:30:38.975196 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:38.975163 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"]
Apr 24 21:30:38.975818 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:38.975592 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" containerID="cri-o://ad0a3961d2c6c726386537932de9ab91a9da3b8bd915d769a9accf3ea6fbada9" gracePeriod=30
Apr 24 21:30:38.975818 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:38.975670 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kube-rbac-proxy" containerID="cri-o://e39e1f835a5b44eb8e696ce01ccde70361772827bf37c468dc4e485c4e773f55" gracePeriod=30
Apr 24 21:30:39.018915 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.018882 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"]
Apr 24 21:30:39.019277 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.019233 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" containerID="cri-o://657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd" gracePeriod=30
Apr 24 21:30:39.019394 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.019280 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kube-rbac-proxy" containerID="cri-o://4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577" gracePeriod=30
Apr 24 21:30:39.105751 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.105714 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"]
Apr 24 21:30:39.106102 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106089 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container"
Apr 24 21:30:39.106187 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106104 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container"
Apr 24 21:30:39.106187 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106128 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kube-rbac-proxy"
Apr 24 21:30:39.106187 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106137 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kube-rbac-proxy"
Apr 24 21:30:39.106275 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106187 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kserve-container"
Apr 24 21:30:39.106275 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.106199 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="404559c5-ccb0-43a1-80ce-1c7609bc5692" containerName="kube-rbac-proxy"
Apr 24 21:30:39.109148 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.109130 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.113729 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.113705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3a238-predictor-serving-cert\""
Apr 24 21:30:39.114474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.114430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3a238-kube-rbac-proxy-sar-config\""
Apr 24 21:30:39.116697 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.116657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.116794 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.116718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxdjv\" (UniqueName: \"kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.116794 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.116754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.140562 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.140536 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"]
Apr 24 21:30:39.217120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.217083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.217120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.217120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxdjv\" (UniqueName: \"kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.217402 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.217157 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.217402 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:39.217228 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-3a238-predictor-serving-cert: secret "error-404-isvc-3a238-predictor-serving-cert" not found
Apr 24 21:30:39.217402 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:39.217299 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls podName:2e267e3e-db1d-4581-837d-bf2a8ed36066 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:39.717276747 +0000 UTC m=+857.568465226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls") pod "error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" (UID: "2e267e3e-db1d-4581-837d-bf2a8ed36066") : secret "error-404-isvc-3a238-predictor-serving-cert" not found
Apr 24 21:30:39.217906 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.217887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.240724 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.240673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxdjv\" (UniqueName: \"kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.501706 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.501449 2578 generic.go:358] "Generic (PLEG): container finished" podID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerID="4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577" exitCode=2
Apr 24 21:30:39.501706 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.501493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerDied","Data":"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577"}
Apr 24 21:30:39.503825 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.503795 2578 generic.go:358] "Generic (PLEG): container finished" podID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerID="e39e1f835a5b44eb8e696ce01ccde70361772827bf37c468dc4e485c4e773f55" exitCode=2
Apr 24 21:30:39.503957 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.503835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerDied","Data":"e39e1f835a5b44eb8e696ce01ccde70361772827bf37c468dc4e485c4e773f55"}
Apr 24 21:30:39.721800 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.721769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:39.724199 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:39.724167 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") pod \"error-404-isvc-3a238-predictor-6d56d496bd-kgdhd\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:40.019424 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.019383 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:40.140554 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.140528 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"]
Apr 24 21:30:40.142633 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:30:40.142606 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e267e3e_db1d_4581_837d_bf2a8ed36066.slice/crio-57d609691f0b834dae74ef76754f8e141f6c038f20565aab4f0fd7d469895159 WatchSource:0}: Error finding container 57d609691f0b834dae74ef76754f8e141f6c038f20565aab4f0fd7d469895159: Status 404 returned error can't find the container with id 57d609691f0b834dae74ef76754f8e141f6c038f20565aab4f0fd7d469895159
Apr 24 21:30:40.509051 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.509020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerStarted","Data":"86664e4b5d234247d3231df0077383d0cebaee5361cd133080840f69357d5104"}
Apr 24 21:30:40.509217 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.509058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerStarted","Data":"635fb245bbac9de0d2f32acfb39272301c9044ed95966a6f6ead1279ea9603cc"}
Apr 24 21:30:40.509217 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.509074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerStarted","Data":"57d609691f0b834dae74ef76754f8e141f6c038f20565aab4f0fd7d469895159"}
Apr 24 21:30:40.509217 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.509138 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:40.526048 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:40.525996 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podStartSLOduration=1.5259820880000001 podStartE2EDuration="1.525982088s" podCreationTimestamp="2026-04-24 21:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:40.524253188 +0000 UTC m=+858.375441686" watchObservedRunningTime="2026-04-24 21:30:40.525982088 +0000 UTC m=+858.377170586"
Apr 24 21:30:41.398523 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:41.398476 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 21:30:41.512735 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:41.512704 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:30:41.513879 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:41.513851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 21:30:42.157918 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:42.157870 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused"
Apr 24 21:30:42.158085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:42.157871 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused"
Apr 24 21:30:42.162155 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:42.162125 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:30:42.162264 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:42.162172 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 21:30:42.515786 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:42.515742 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 21:30:43.521347 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.521313 2578 generic.go:358] "Generic (PLEG): container finished" podID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerID="ad0a3961d2c6c726386537932de9ab91a9da3b8bd915d769a9accf3ea6fbada9" exitCode=0
Apr 24 21:30:43.521749 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.521387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerDied","Data":"ad0a3961d2c6c726386537932de9ab91a9da3b8bd915d769a9accf3ea6fbada9"}
Apr 24 21:30:43.521749 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.521432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" event={"ID":"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1","Type":"ContainerDied","Data":"4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956"}
Apr 24 21:30:43.521749 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.521446 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d38e909fd6c96fca835ba9f2c58093282ec4b2e206396e986125d5d96ba9956"
Apr 24 21:30:43.522051 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.522033 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"
Apr 24 21:30:43.551898 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.551857 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls\") pod \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") "
Apr 24 21:30:43.552067 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.551954 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkbht\" (UniqueName: \"kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht\") pod \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") "
Apr 24 21:30:43.552067 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.551999 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") "
Apr 24 21:30:43.552067 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.552051 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location\") pod \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\" (UID: \"facf56c6-f309-4e5d-8667-c2dd2f6fe3f1\") "
Apr 24 21:30:43.552446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.552411 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" (UID: "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:30:43.552554 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.552438 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" (UID: "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:43.553971 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.553949 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" (UID: "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:30:43.554051 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.554028 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht" (OuterVolumeSpecName: "kube-api-access-rkbht") pod "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" (UID: "facf56c6-f309-4e5d-8667-c2dd2f6fe3f1"). InnerVolumeSpecName "kube-api-access-rkbht". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:30:43.652778 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.652739 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:43.652778 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.652770 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkbht\" (UniqueName: \"kubernetes.io/projected/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kube-api-access-rkbht\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:43.652778 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.652781 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:43.652998 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:43.652792 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1-kserve-provision-location\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:30:44.365118 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.365093 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"
Apr 24 21:30:44.458076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.457985 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location\") pod \"412b7f5f-3972-48a5-a5b2-ef53db782271\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") "
Apr 24 21:30:44.458076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.458058 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"412b7f5f-3972-48a5-a5b2-ef53db782271\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") "
Apr 24 21:30:44.458076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.458078 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8xm\" (UniqueName: \"kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm\") pod \"412b7f5f-3972-48a5-a5b2-ef53db782271\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") "
Apr 24 21:30:44.458306 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.458098 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") pod \"412b7f5f-3972-48a5-a5b2-ef53db782271\" (UID: \"412b7f5f-3972-48a5-a5b2-ef53db782271\") "
Apr 24 21:30:44.458356 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.458326 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "412b7f5f-3972-48a5-a5b2-ef53db782271" (UID: "412b7f5f-3972-48a5-a5b2-ef53db782271"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:30:44.458420 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.458398 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "412b7f5f-3972-48a5-a5b2-ef53db782271" (UID: "412b7f5f-3972-48a5-a5b2-ef53db782271"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:44.460112 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.460082 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "412b7f5f-3972-48a5-a5b2-ef53db782271" (UID: "412b7f5f-3972-48a5-a5b2-ef53db782271"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:30:44.460296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.460278 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm" (OuterVolumeSpecName: "kube-api-access-7j8xm") pod "412b7f5f-3972-48a5-a5b2-ef53db782271" (UID: "412b7f5f-3972-48a5-a5b2-ef53db782271"). InnerVolumeSpecName "kube-api-access-7j8xm".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:44.526206 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526171 2578 generic.go:358] "Generic (PLEG): container finished" podID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerID="657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd" exitCode=0 Apr 24 21:30:44.526629 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526251 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" Apr 24 21:30:44.526629 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerDied","Data":"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd"} Apr 24 21:30:44.526629 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526298 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch" event={"ID":"412b7f5f-3972-48a5-a5b2-ef53db782271","Type":"ContainerDied","Data":"26cc82a4a8b924bbb08e458be7018d3380aa2c16ace169bbfcf6d4e6f4e6970c"} Apr 24 21:30:44.526629 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526320 2578 scope.go:117] "RemoveContainer" containerID="4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577" Apr 24 21:30:44.526629 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.526505 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml" Apr 24 21:30:44.535769 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.535753 2578 scope.go:117] "RemoveContainer" containerID="657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd" Apr 24 21:30:44.542870 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.542846 2578 scope.go:117] "RemoveContainer" containerID="b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5" Apr 24 21:30:44.549713 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.549687 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"] Apr 24 21:30:44.549945 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.549926 2578 scope.go:117] "RemoveContainer" containerID="4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577" Apr 24 21:30:44.550202 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:44.550179 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577\": container with ID starting with 4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577 not found: ID does not exist" containerID="4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577" Apr 24 21:30:44.550267 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.550214 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577"} err="failed to get container status \"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577\": rpc error: code = NotFound desc = could not find container \"4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577\": container with ID starting with 4e3c3a9bfa56e0fe8cd48b0beafd3f99a2eee517cbeacccf72ed5d57369b1577 not found: ID does not exist" Apr 24 
21:30:44.550267 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.550238 2578 scope.go:117] "RemoveContainer" containerID="657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd" Apr 24 21:30:44.550517 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:44.550500 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd\": container with ID starting with 657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd not found: ID does not exist" containerID="657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd" Apr 24 21:30:44.550557 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.550524 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd"} err="failed to get container status \"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd\": rpc error: code = NotFound desc = could not find container \"657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd\": container with ID starting with 657ef6de547f86a8ec99077021a25433352cc1f2263c409b0a910fa2872d0efd not found: ID does not exist" Apr 24 21:30:44.550557 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.550541 2578 scope.go:117] "RemoveContainer" containerID="b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5" Apr 24 21:30:44.550803 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:30:44.550777 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5\": container with ID starting with b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5 not found: ID does not exist" containerID="b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5" Apr 24 21:30:44.550891 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.550811 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5"} err="failed to get container status \"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5\": rpc error: code = NotFound desc = could not find container \"b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5\": container with ID starting with b51a4b1f3dad0db9f3d35b0f037c38739c08694acb20c84daa5f461edae5f5a5 not found: ID does not exist" Apr 24 21:30:44.554222 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.554190 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch"] Apr 24 21:30:44.558967 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.558950 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/412b7f5f-3972-48a5-a5b2-ef53db782271-kserve-provision-location\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:30:44.559022 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.558970 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/412b7f5f-3972-48a5-a5b2-ef53db782271-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:30:44.559022 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.558982 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7j8xm\" (UniqueName: \"kubernetes.io/projected/412b7f5f-3972-48a5-a5b2-ef53db782271-kube-api-access-7j8xm\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:30:44.559022 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.558991 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/412b7f5f-3972-48a5-a5b2-ef53db782271-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:30:44.565977 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.565955 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"] Apr 24 21:30:44.570785 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.570764 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml"] Apr 24 21:30:44.712446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.712367 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" path="/var/lib/kubelet/pods/412b7f5f-3972-48a5-a5b2-ef53db782271/volumes" Apr 24 21:30:44.712881 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:44.712868 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" path="/var/lib/kubelet/pods/facf56c6-f309-4e5d-8667-c2dd2f6fe3f1/volumes" Apr 24 21:30:47.520207 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:47.520174 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" Apr 24 21:30:47.520747 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:47.520720 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:30:51.398324 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:51.398296 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:30:57.521123 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:30:57.521083 
2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:31:07.521120 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:31:07.521076 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:31:17.521134 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:31:17.521048 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:31:22.675121 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:31:22.675088 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:31:22.677085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:31:22.677064 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:31:27.522139 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:31:27.522111 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" Apr 24 21:35:22.704818 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:35:22.704788 2578 scope.go:117] "RemoveContainer" containerID="e39e1f835a5b44eb8e696ce01ccde70361772827bf37c468dc4e485c4e773f55" Apr 24 21:35:22.712708 
ip-10-0-132-81 kubenswrapper[2578]: I0424 21:35:22.712688 2578 scope.go:117] "RemoveContainer" containerID="ad0a3961d2c6c726386537932de9ab91a9da3b8bd915d769a9accf3ea6fbada9" Apr 24 21:35:22.719854 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:35:22.719829 2578 scope.go:117] "RemoveContainer" containerID="6d742a3e67304dc0178739221f7302eca0124612503bf706ee38aa95fff2dcf6" Apr 24 21:36:22.696011 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:36:22.695978 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:36:22.702218 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:36:22.702195 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:39:17.956854 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:17.956822 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:39:17.959196 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:17.957156 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" containerID="cri-o://bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5" gracePeriod=30 Apr 24 21:39:17.959196 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:17.957192 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kube-rbac-proxy" containerID="cri-o://505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd" gracePeriod=30 Apr 24 21:39:18.044587 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.044544 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"] Apr 24 21:39:18.044969 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.044950 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="storage-initializer" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.044971 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="storage-initializer" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.044989 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="storage-initializer" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.044997 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="storage-initializer" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045018 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kube-rbac-proxy" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045026 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kube-rbac-proxy" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045044 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kube-rbac-proxy" Apr 24 21:39:18.045057 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045053 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kube-rbac-proxy" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 
21:39:18.045065 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045074 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045085 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045094 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045172 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045187 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kube-rbac-proxy" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045202 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="facf56c6-f309-4e5d-8667-c2dd2f6fe3f1" containerName="kserve-container" Apr 24 21:39:18.045393 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.045212 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="412b7f5f-3972-48a5-a5b2-ef53db782271" containerName="kube-rbac-proxy" Apr 24 21:39:18.048356 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.048333 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.050665 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.050637 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f3231-predictor-serving-cert\"" Apr 24 21:39:18.050812 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.050645 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f3231-kube-rbac-proxy-sar-config\"" Apr 24 21:39:18.065624 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.065595 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"] Apr 24 21:39:18.139133 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.139092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hfc5\" (UniqueName: \"kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.139319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.139157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.139319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.139198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.203381 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.203348 2578 generic.go:358] "Generic (PLEG): container finished" podID="319d9352-b194-4a33-83c2-3906eaa110fd" containerID="505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd" exitCode=2 Apr 24 21:39:18.203579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.203422 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerDied","Data":"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd"} Apr 24 21:39:18.239560 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.239528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.239752 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.239572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.239752 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.239619 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7hfc5\" (UniqueName: \"kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.239752 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:39:18.239702 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-f3231-predictor-serving-cert: secret "error-404-isvc-f3231-predictor-serving-cert" not found Apr 24 21:39:18.239892 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:39:18.239775 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls podName:c3ff008d-cbf1-453b-b1a9-541247ef77fd nodeName:}" failed. No retries permitted until 2026-04-24 21:39:18.739753938 +0000 UTC m=+1376.590942414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls") pod "error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" (UID: "c3ff008d-cbf1-453b-b1a9-541247ef77fd") : secret "error-404-isvc-f3231-predictor-serving-cert" not found Apr 24 21:39:18.240324 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.240306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.249028 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.248996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hfc5\" 
(UniqueName: \"kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.742916 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.742884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.745390 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.745362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") pod \"error-404-isvc-f3231-predictor-69f59b6d96-7wdnc\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") " pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:18.959646 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:18.959607 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:19.109842 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:19.109803 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"] Apr 24 21:39:19.112892 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:39:19.112864 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ff008d_cbf1_453b_b1a9_541247ef77fd.slice/crio-917d95958d2c40be67c4106d7abec1a275437d56aa74a8c3aff28878c6b64a06 WatchSource:0}: Error finding container 917d95958d2c40be67c4106d7abec1a275437d56aa74a8c3aff28878c6b64a06: Status 404 returned error can't find the container with id 917d95958d2c40be67c4106d7abec1a275437d56aa74a8c3aff28878c6b64a06 Apr 24 21:39:19.114869 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:19.114848 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:39:19.208723 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:19.208672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerStarted","Data":"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"} Apr 24 21:39:19.208723 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:19.208726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerStarted","Data":"917d95958d2c40be67c4106d7abec1a275437d56aa74a8c3aff28878c6b64a06"} Apr 24 21:39:20.213019 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:20.212979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" 
event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerStarted","Data":"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"} Apr 24 21:39:20.213433 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:20.213114 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:20.232694 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:20.232630 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podStartSLOduration=2.232613568 podStartE2EDuration="2.232613568s" podCreationTimestamp="2026-04-24 21:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:20.230062234 +0000 UTC m=+1378.081250732" watchObservedRunningTime="2026-04-24 21:39:20.232613568 +0000 UTC m=+1378.083802131" Apr 24 21:39:21.216152 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.216118 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:21.217473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.217436 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:21.407972 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.407948 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:39:21.462091 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.461983 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config\") pod \"319d9352-b194-4a33-83c2-3906eaa110fd\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " Apr 24 21:39:21.462325 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.462301 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-b868d-kube-rbac-proxy-sar-config") pod "319d9352-b194-4a33-83c2-3906eaa110fd" (UID: "319d9352-b194-4a33-83c2-3906eaa110fd"). InnerVolumeSpecName "error-404-isvc-b868d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:21.562773 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.562735 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") pod \"319d9352-b194-4a33-83c2-3906eaa110fd\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " Apr 24 21:39:21.562773 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.562776 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqgdb\" (UniqueName: \"kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb\") pod \"319d9352-b194-4a33-83c2-3906eaa110fd\" (UID: \"319d9352-b194-4a33-83c2-3906eaa110fd\") " Apr 24 21:39:21.563007 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.562957 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-b868d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/319d9352-b194-4a33-83c2-3906eaa110fd-error-404-isvc-b868d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:21.564876 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.564840 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "319d9352-b194-4a33-83c2-3906eaa110fd" (UID: "319d9352-b194-4a33-83c2-3906eaa110fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:21.564876 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.564862 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb" (OuterVolumeSpecName: "kube-api-access-gqgdb") pod "319d9352-b194-4a33-83c2-3906eaa110fd" (UID: "319d9352-b194-4a33-83c2-3906eaa110fd"). 
InnerVolumeSpecName "kube-api-access-gqgdb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:21.663475 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.663423 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/319d9352-b194-4a33-83c2-3906eaa110fd-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:21.663475 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:21.663469 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqgdb\" (UniqueName: \"kubernetes.io/projected/319d9352-b194-4a33-83c2-3906eaa110fd-kube-api-access-gqgdb\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:22.219946 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.219909 2578 generic.go:358] "Generic (PLEG): container finished" podID="319d9352-b194-4a33-83c2-3906eaa110fd" containerID="bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5" exitCode=0 Apr 24 21:39:22.220349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.219990 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" Apr 24 21:39:22.220349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.219985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerDied","Data":"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5"} Apr 24 21:39:22.220349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.220097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" event={"ID":"319d9352-b194-4a33-83c2-3906eaa110fd","Type":"ContainerDied","Data":"22ca133271eb0e75ee74bb8b19600e23302db103b542175d2e7120d48baea1e3"} Apr 24 21:39:22.220349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.220115 2578 scope.go:117] "RemoveContainer" containerID="505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd" Apr 24 21:39:22.220538 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.220504 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:22.228537 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.228521 2578 scope.go:117] "RemoveContainer" containerID="bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5" Apr 24 21:39:22.235598 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.235578 2578 scope.go:117] "RemoveContainer" containerID="505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd" Apr 24 21:39:22.235853 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:39:22.235834 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd\": container with ID starting with 505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd not found: ID does not exist" containerID="505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd" Apr 24 21:39:22.235932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.235868 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd"} err="failed to get container status \"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd\": rpc error: code = NotFound desc = could not find container \"505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd\": container with ID starting with 505be08ab13336c0520308f75512aa19d5d26e7c64c51cee0ba4cf5828872cbd not found: ID does not exist" Apr 24 21:39:22.235932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.235892 2578 scope.go:117] "RemoveContainer" containerID="bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5" Apr 24 21:39:22.236115 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:39:22.236099 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5\": container with ID starting with bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5 not found: ID does not exist" containerID="bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5" Apr 24 21:39:22.236154 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.236122 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5"} err="failed to get container status \"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5\": rpc error: code = NotFound desc = could not find container 
\"bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5\": container with ID starting with bf686605a33d5c5efe99d585736df924df363b3e6f895d5b7bfb2d6da732eba5 not found: ID does not exist" Apr 24 21:39:22.241144 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.241121 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:39:22.246534 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.246512 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk"] Apr 24 21:39:22.392446 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.392392 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": context deadline exceeded" Apr 24 21:39:22.399185 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.399154 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: i/o timeout" Apr 24 21:39:22.712474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:22.712438 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" path="/var/lib/kubelet/pods/319d9352-b194-4a33-83c2-3906eaa110fd/volumes" Apr 24 21:39:27.225184 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:27.225157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" Apr 24 21:39:27.225650 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:27.225625 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:37.225901 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:37.225862 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:47.226358 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:47.226318 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:53.970538 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:53.970504 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"] Apr 24 21:39:53.971068 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:53.970805 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container" containerID="cri-o://635fb245bbac9de0d2f32acfb39272301c9044ed95966a6f6ead1279ea9603cc" gracePeriod=30 Apr 24 21:39:53.971068 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:53.970854 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kube-rbac-proxy" 
containerID="cri-o://86664e4b5d234247d3231df0077383d0cebaee5361cd133080840f69357d5104" gracePeriod=30 Apr 24 21:39:54.029772 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.029734 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"] Apr 24 21:39:54.030134 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030116 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kube-rbac-proxy" Apr 24 21:39:54.030216 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030136 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kube-rbac-proxy" Apr 24 21:39:54.030216 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030173 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" Apr 24 21:39:54.030216 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030183 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" Apr 24 21:39:54.030366 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030264 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kserve-container" Apr 24 21:39:54.030366 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.030280 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="319d9352-b194-4a33-83c2-3906eaa110fd" containerName="kube-rbac-proxy" Apr 24 21:39:54.033416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.033393 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.035733 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.035707 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c1a35-predictor-serving-cert\"" Apr 24 21:39:54.036316 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.036289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\"" Apr 24 21:39:54.045087 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.045066 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"] Apr 24 21:39:54.085579 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.085546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.085753 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.085610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.085824 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.085744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqzm\" (UniqueName: 
\"kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.186304 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.186263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.186505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.186355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.186505 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.186412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqzm\" (UniqueName: \"kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.186638 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:39:54.186522 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-serving-cert: secret "error-404-isvc-c1a35-predictor-serving-cert" not found Apr 24 21:39:54.186638 ip-10-0-132-81 kubenswrapper[2578]: E0424 
21:39:54.186626 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls podName:cdf8dd72-b419-4e7d-9ee4-627089900967 nodeName:}" failed. No retries permitted until 2026-04-24 21:39:54.686604854 +0000 UTC m=+1412.537793335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls") pod "error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" (UID: "cdf8dd72-b419-4e7d-9ee4-627089900967") : secret "error-404-isvc-c1a35-predictor-serving-cert" not found Apr 24 21:39:54.186953 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.186932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.195169 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.195145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqzm\" (UniqueName: \"kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.325521 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.325483 2578 generic.go:358] "Generic (PLEG): container finished" podID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerID="86664e4b5d234247d3231df0077383d0cebaee5361cd133080840f69357d5104" exitCode=2 Apr 24 21:39:54.325738 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.325535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerDied","Data":"86664e4b5d234247d3231df0077383d0cebaee5361cd133080840f69357d5104"} Apr 24 21:39:54.693705 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.693602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.696085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.696049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") pod \"error-404-isvc-c1a35-predictor-5b5885dd-sx2hh\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:54.944060 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:54.943956 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:55.065533 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:55.065495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"] Apr 24 21:39:55.071712 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:39:55.071668 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf8dd72_b419_4e7d_9ee4_627089900967.slice/crio-38fe657b2972a2cedcf2ff981b018aa1eb4a0e12bf1cdbb17d4da9f5efdc612f WatchSource:0}: Error finding container 38fe657b2972a2cedcf2ff981b018aa1eb4a0e12bf1cdbb17d4da9f5efdc612f: Status 404 returned error can't find the container with id 38fe657b2972a2cedcf2ff981b018aa1eb4a0e12bf1cdbb17d4da9f5efdc612f Apr 24 21:39:55.331915 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:55.331874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerStarted","Data":"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841"} Apr 24 21:39:55.331915 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:55.331919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerStarted","Data":"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e"} Apr 24 21:39:55.332138 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:55.331933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerStarted","Data":"38fe657b2972a2cedcf2ff981b018aa1eb4a0e12bf1cdbb17d4da9f5efdc612f"} Apr 24 21:39:55.332138 ip-10-0-132-81 kubenswrapper[2578]: 
I0424 21:39:55.332025 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:55.356842 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:55.356792 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podStartSLOduration=1.35677686 podStartE2EDuration="1.35677686s" podCreationTimestamp="2026-04-24 21:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:55.355611396 +0000 UTC m=+1413.206799895" watchObservedRunningTime="2026-04-24 21:39:55.35677686 +0000 UTC m=+1413.207965358" Apr 24 21:39:56.334693 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:56.334645 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" Apr 24 21:39:56.335964 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:56.335932 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:39:57.225713 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.225650 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:39:57.339999 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.339968 2578 generic.go:358] "Generic (PLEG): container finished" podID="2e267e3e-db1d-4581-837d-bf2a8ed36066" 
containerID="635fb245bbac9de0d2f32acfb39272301c9044ed95966a6f6ead1279ea9603cc" exitCode=0 Apr 24 21:39:57.340382 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.340037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerDied","Data":"635fb245bbac9de0d2f32acfb39272301c9044ed95966a6f6ead1279ea9603cc"} Apr 24 21:39:57.340460 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.340439 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:39:57.421915 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.421891 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" Apr 24 21:39:57.519108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.519062 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") pod \"2e267e3e-db1d-4581-837d-bf2a8ed36066\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " Apr 24 21:39:57.519108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.519113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config\") pod \"2e267e3e-db1d-4581-837d-bf2a8ed36066\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " Apr 24 21:39:57.519361 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.519137 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-gxdjv\" (UniqueName: \"kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv\") pod \"2e267e3e-db1d-4581-837d-bf2a8ed36066\" (UID: \"2e267e3e-db1d-4581-837d-bf2a8ed36066\") " Apr 24 21:39:57.519595 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.519565 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-3a238-kube-rbac-proxy-sar-config") pod "2e267e3e-db1d-4581-837d-bf2a8ed36066" (UID: "2e267e3e-db1d-4581-837d-bf2a8ed36066"). InnerVolumeSpecName "error-404-isvc-3a238-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:57.521319 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.521288 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e267e3e-db1d-4581-837d-bf2a8ed36066" (UID: "2e267e3e-db1d-4581-837d-bf2a8ed36066"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:57.521413 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.521319 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv" (OuterVolumeSpecName: "kube-api-access-gxdjv") pod "2e267e3e-db1d-4581-837d-bf2a8ed36066" (UID: "2e267e3e-db1d-4581-837d-bf2a8ed36066"). InnerVolumeSpecName "kube-api-access-gxdjv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:57.620657 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.620617 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxdjv\" (UniqueName: \"kubernetes.io/projected/2e267e3e-db1d-4581-837d-bf2a8ed36066-kube-api-access-gxdjv\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:57.620657 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.620652 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e267e3e-db1d-4581-837d-bf2a8ed36066-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:57.620657 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:57.620664 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-3a238-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e267e3e-db1d-4581-837d-bf2a8ed36066-error-404-isvc-3a238-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:39:58.344594 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.344559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd" event={"ID":"2e267e3e-db1d-4581-837d-bf2a8ed36066","Type":"ContainerDied","Data":"57d609691f0b834dae74ef76754f8e141f6c038f20565aab4f0fd7d469895159"} Apr 24 21:39:58.345014 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.344603 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"
Apr 24 21:39:58.345014 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.344607 2578 scope.go:117] "RemoveContainer" containerID="86664e4b5d234247d3231df0077383d0cebaee5361cd133080840f69357d5104"
Apr 24 21:39:58.352948 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.352880 2578 scope.go:117] "RemoveContainer" containerID="635fb245bbac9de0d2f32acfb39272301c9044ed95966a6f6ead1279ea9603cc"
Apr 24 21:39:58.366827 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.366799 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"]
Apr 24 21:39:58.371471 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.371446 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd"]
Apr 24 21:39:58.713186 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:39:58.713088 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" path="/var/lib/kubelet/pods/2e267e3e-db1d-4581-837d-bf2a8ed36066/volumes"
Apr 24 21:40:02.346189 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:02.346156 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"
Apr 24 21:40:02.346720 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:02.346691 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:40:07.226824 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:07.226793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"
Apr 24 21:40:12.347197 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:12.347109 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:40:22.347063 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:22.347016 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:40:28.253766 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.253727 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"]
Apr 24 21:40:28.254296 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.254053 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container" containerID="cri-o://111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f" gracePeriod=30
Apr 24 21:40:28.254374 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.254295 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kube-rbac-proxy" containerID="cri-o://b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149" gracePeriod=30
Apr 24 21:40:28.319699 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.319655 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"]
Apr 24 21:40:28.320029 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320015 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container"
Apr 24 21:40:28.320102 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320030 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container"
Apr 24 21:40:28.320102 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320040 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kube-rbac-proxy"
Apr 24 21:40:28.320102 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320046 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kube-rbac-proxy"
Apr 24 21:40:28.320102 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320094 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kserve-container"
Apr 24 21:40:28.320301 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.320106 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e267e3e-db1d-4581-837d-bf2a8ed36066" containerName="kube-rbac-proxy"
Apr 24 21:40:28.324111 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.324093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.326569 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.326544 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-75b0d-predictor-serving-cert\""
Apr 24 21:40:28.326703 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.326593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\""
Apr 24 21:40:28.332710 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.332605 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"]
Apr 24 21:40:28.454208 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.454167 2578 generic.go:358] "Generic (PLEG): container finished" podID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerID="b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149" exitCode=2
Apr 24 21:40:28.454383 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.454213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerDied","Data":"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"}
Apr 24 21:40:28.481705 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.481647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flc2\" (UniqueName: \"kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.481890 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.481795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.481890 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.481844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.582493 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.582396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.582493 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.582455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.582761 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.582502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2flc2\" (UniqueName: \"kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.583239 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.583218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.584947 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.584927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.590877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.590856 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flc2\" (UniqueName: \"kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2\") pod \"error-404-isvc-75b0d-predictor-758f7456fd-b6kvp\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.636659 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.636622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:28.769614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:28.769464 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"]
Apr 24 21:40:28.772082 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:40:28.772054 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105dc369_8ab5_42ab_bf68_ea4f803de9f1.slice/crio-cf961d00b2c6643f231fbdd53693c1b3e8a34250d89f77de29a88bbc55355fc9 WatchSource:0}: Error finding container cf961d00b2c6643f231fbdd53693c1b3e8a34250d89f77de29a88bbc55355fc9: Status 404 returned error can't find the container with id cf961d00b2c6643f231fbdd53693c1b3e8a34250d89f77de29a88bbc55355fc9
Apr 24 21:40:29.459332 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:29.459295 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerStarted","Data":"43675a7bb270bcdb6c3d7b0062cf6a5beb53de1aa9d2b4afc4e62709a4cd28cd"}
Apr 24 21:40:29.459332 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:29.459334 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerStarted","Data":"b5705772974d6c52eb2e04d2c32f05e26fda5cf8bdacf564df07cdbc7d9cd661"}
Apr 24 21:40:29.459780 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:29.459350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerStarted","Data":"cf961d00b2c6643f231fbdd53693c1b3e8a34250d89f77de29a88bbc55355fc9"}
Apr 24 21:40:29.459780 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:29.459417 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:29.476621 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:29.476571 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podStartSLOduration=1.476555658 podStartE2EDuration="1.476555658s" podCreationTimestamp="2026-04-24 21:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:29.476177539 +0000 UTC m=+1447.327366037" watchObservedRunningTime="2026-04-24 21:40:29.476555658 +0000 UTC m=+1447.327744159"
Apr 24 21:40:30.462827 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:30.462794 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:30.464017 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:30.463988 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:40:31.466427 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.466381 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:40:31.698169 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.698144 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"
Apr 24 21:40:31.813244 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.813208 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hfc5\" (UniqueName: \"kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5\") pod \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") "
Apr 24 21:40:31.813395 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.813254 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") pod \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") "
Apr 24 21:40:31.813465 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.813379 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config\") pod \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\" (UID: \"c3ff008d-cbf1-453b-b1a9-541247ef77fd\") "
Apr 24 21:40:31.813737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.813708 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f3231-kube-rbac-proxy-sar-config") pod "c3ff008d-cbf1-453b-b1a9-541247ef77fd" (UID: "c3ff008d-cbf1-453b-b1a9-541247ef77fd"). InnerVolumeSpecName "error-404-isvc-f3231-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:40:31.815349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.815317 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3ff008d-cbf1-453b-b1a9-541247ef77fd" (UID: "c3ff008d-cbf1-453b-b1a9-541247ef77fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:40:31.815349 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.815330 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5" (OuterVolumeSpecName: "kube-api-access-7hfc5") pod "c3ff008d-cbf1-453b-b1a9-541247ef77fd" (UID: "c3ff008d-cbf1-453b-b1a9-541247ef77fd"). InnerVolumeSpecName "kube-api-access-7hfc5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:40:31.914595 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.914559 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hfc5\" (UniqueName: \"kubernetes.io/projected/c3ff008d-cbf1-453b-b1a9-541247ef77fd-kube-api-access-7hfc5\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:40:31.914595 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.914588 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ff008d-cbf1-453b-b1a9-541247ef77fd-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:40:31.914595 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:31.914600 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f3231-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ff008d-cbf1-453b-b1a9-541247ef77fd-error-404-isvc-f3231-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:40:32.347116 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.347072 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:40:32.471076 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.471036 2578 generic.go:358] "Generic (PLEG): container finished" podID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerID="111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f" exitCode=0
Apr 24 21:40:32.471473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.471118 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"
Apr 24 21:40:32.471473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.471125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerDied","Data":"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"}
Apr 24 21:40:32.471473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.471169 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc" event={"ID":"c3ff008d-cbf1-453b-b1a9-541247ef77fd","Type":"ContainerDied","Data":"917d95958d2c40be67c4106d7abec1a275437d56aa74a8c3aff28878c6b64a06"}
Apr 24 21:40:32.471473 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.471186 2578 scope.go:117] "RemoveContainer" containerID="b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"
Apr 24 21:40:32.479358 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.479339 2578 scope.go:117] "RemoveContainer" containerID="111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"
Apr 24 21:40:32.487175 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.487152 2578 scope.go:117] "RemoveContainer" containerID="b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"
Apr 24 21:40:32.487457 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:40:32.487438 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149\": container with ID starting with b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149 not found: ID does not exist" containerID="b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"
Apr 24 21:40:32.487499 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.487467 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149"} err="failed to get container status \"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149\": rpc error: code = NotFound desc = could not find container \"b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149\": container with ID starting with b55d920dee90ade24762a95d01f957dff37df2c7fa0a2afab7dc9dd62ea3f149 not found: ID does not exist"
Apr 24 21:40:32.487499 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.487486 2578 scope.go:117] "RemoveContainer" containerID="111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"
Apr 24 21:40:32.487754 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:40:32.487733 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f\": container with ID starting with 111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f not found: ID does not exist" containerID="111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"
Apr 24 21:40:32.487796 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.487761 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f"} err="failed to get container status \"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f\": rpc error: code = NotFound desc = could not find container \"111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f\": container with ID starting with 111e1f97c7d9e4a4a565ab20448880b834b69cddda5497ffdf069d87f1d5c82f not found: ID does not exist"
Apr 24 21:40:32.494547 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.494521 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"]
Apr 24 21:40:32.501883 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.501845 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc"]
Apr 24 21:40:32.712931 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:32.712838 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" path="/var/lib/kubelet/pods/c3ff008d-cbf1-453b-b1a9-541247ef77fd/volumes"
Apr 24 21:40:36.471016 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:36.470984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:40:36.471569 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:36.471546 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:40:42.346847 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:42.346816 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"
Apr 24 21:40:46.472199 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:46.472149 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:40:56.472194 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:40:56.472149 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:41:04.051843 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.051804 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"]
Apr 24 21:41:04.052424 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.052193 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" containerID="cri-o://828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e" gracePeriod=30
Apr 24 21:41:04.052424 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.052251 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kube-rbac-proxy" containerID="cri-o://8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841" gracePeriod=30
Apr 24 21:41:04.238123 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238084 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:41:04.238617 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238602 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container"
Apr 24 21:41:04.238669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238620 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container"
Apr 24 21:41:04.238669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238639 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kube-rbac-proxy"
Apr 24 21:41:04.238669 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238648 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kube-rbac-proxy"
Apr 24 21:41:04.238792 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238755 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kserve-container"
Apr 24 21:41:04.238792 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.238771 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3ff008d-cbf1-453b-b1a9-541247ef77fd" containerName="kube-rbac-proxy"
Apr 24 21:41:04.242230 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.242205 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.244432 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.244406 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-64806-kube-rbac-proxy-sar-config\""
Apr 24 21:41:04.244590 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.244505 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-64806-predictor-serving-cert\""
Apr 24 21:41:04.251020 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.250984 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:41:04.271533 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.271485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.271777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.271649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.271777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.271708 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr98t\" (UniqueName: \"kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.372932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.372825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.372932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.372867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr98t\" (UniqueName: \"kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.372932 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.372911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.373547 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.373517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.375325 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.375306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.380334 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.380310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr98t\" (UniqueName: \"kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t\") pod \"error-404-isvc-64806-predictor-5cd95b49d6-g7v8z\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") " pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.555856 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.555811 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:04.576387 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.576354 2578 generic.go:358] "Generic (PLEG): container finished" podID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerID="8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841" exitCode=2
Apr 24 21:41:04.576543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.576425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerDied","Data":"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841"}
Apr 24 21:41:04.684453 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:04.684426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:41:04.687008 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:41:04.686979 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b157333_a6fe_4cb3_9680_aa9b1570c066.slice/crio-2f4e35033c4cd89d07735afefb27e63122f0b588e2844ec59c3014270dfdc019 WatchSource:0}: Error finding container 2f4e35033c4cd89d07735afefb27e63122f0b588e2844ec59c3014270dfdc019: Status 404 returned error can't find the container with id 2f4e35033c4cd89d07735afefb27e63122f0b588e2844ec59c3014270dfdc019
Apr 24 21:41:05.581416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:05.581381 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerStarted","Data":"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"}
Apr 24 21:41:05.581416 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:05.581419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerStarted","Data":"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"}
Apr 24 21:41:05.581864 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:05.581430 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerStarted","Data":"2f4e35033c4cd89d07735afefb27e63122f0b588e2844ec59c3014270dfdc019"}
Apr 24 21:41:05.581864 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:05.581585 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:05.597951 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:05.597891 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podStartSLOduration=1.597872094 podStartE2EDuration="1.597872094s" podCreationTimestamp="2026-04-24 21:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:05.597346916 +0000 UTC m=+1483.448535428" watchObservedRunningTime="2026-04-24 21:41:05.597872094 +0000 UTC m=+1483.449060599"
Apr 24 21:41:06.472415 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:06.472370 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 21:41:06.585355 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:06.585317 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:41:06.586750 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:06.586716 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 21:41:07.341536 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.341487 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 24 21:41:07.521327 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.521301 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"
Apr 24 21:41:07.590521 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590429 2578 generic.go:358] "Generic (PLEG): container finished" podID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerID="828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e" exitCode=0
Apr 24 21:41:07.590521 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590503 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"
Apr 24 21:41:07.591005 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerDied","Data":"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e"}
Apr 24 21:41:07.591005 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh" event={"ID":"cdf8dd72-b419-4e7d-9ee4-627089900967","Type":"ContainerDied","Data":"38fe657b2972a2cedcf2ff981b018aa1eb4a0e12bf1cdbb17d4da9f5efdc612f"}
Apr 24 21:41:07.591005 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590572 2578 scope.go:117] "RemoveContainer" containerID="8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841"
Apr 24 21:41:07.591005 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.590920 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 21:41:07.599155 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.599125 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") pod \"cdf8dd72-b419-4e7d-9ee4-627089900967\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") "
Apr 24 21:41:07.599309 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.599246 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName:
\"kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config\") pod \"cdf8dd72-b419-4e7d-9ee4-627089900967\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " Apr 24 21:41:07.599460 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.599310 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqzm\" (UniqueName: \"kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm\") pod \"cdf8dd72-b419-4e7d-9ee4-627089900967\" (UID: \"cdf8dd72-b419-4e7d-9ee4-627089900967\") " Apr 24 21:41:07.599460 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.599380 2578 scope.go:117] "RemoveContainer" containerID="828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e" Apr 24 21:41:07.599722 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.599654 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c1a35-kube-rbac-proxy-sar-config") pod "cdf8dd72-b419-4e7d-9ee4-627089900967" (UID: "cdf8dd72-b419-4e7d-9ee4-627089900967"). InnerVolumeSpecName "error-404-isvc-c1a35-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:07.601603 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.601575 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm" (OuterVolumeSpecName: "kube-api-access-2dqzm") pod "cdf8dd72-b419-4e7d-9ee4-627089900967" (UID: "cdf8dd72-b419-4e7d-9ee4-627089900967"). InnerVolumeSpecName "kube-api-access-2dqzm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:41:07.601816 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.601793 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cdf8dd72-b419-4e7d-9ee4-627089900967" (UID: "cdf8dd72-b419-4e7d-9ee4-627089900967"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:07.616639 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.616614 2578 scope.go:117] "RemoveContainer" containerID="8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841" Apr 24 21:41:07.617096 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:41:07.617068 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841\": container with ID starting with 8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841 not found: ID does not exist" containerID="8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841" Apr 24 21:41:07.617172 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.617105 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841"} err="failed to get container status \"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841\": rpc error: code = NotFound desc = could not find container \"8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841\": container with ID starting with 8ef59ccf171f04f0b8083ce8cfe72cb7861bbf143d93e461e90e537955ef0841 not found: ID does not exist" Apr 24 21:41:07.617172 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.617124 2578 scope.go:117] "RemoveContainer" containerID="828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e" Apr 24 
21:41:07.617412 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:41:07.617390 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e\": container with ID starting with 828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e not found: ID does not exist" containerID="828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e" Apr 24 21:41:07.617452 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.617415 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e"} err="failed to get container status \"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e\": rpc error: code = NotFound desc = could not find container \"828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e\": container with ID starting with 828ddfab8147285879c496355222db135525c59ba18934c4f520ff5f4387363e not found: ID does not exist" Apr 24 21:41:07.700108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.700062 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c1a35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cdf8dd72-b419-4e7d-9ee4-627089900967-error-404-isvc-c1a35-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:41:07.700108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.700098 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2dqzm\" (UniqueName: \"kubernetes.io/projected/cdf8dd72-b419-4e7d-9ee4-627089900967-kube-api-access-2dqzm\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:41:07.700108 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.700113 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cdf8dd72-b419-4e7d-9ee4-627089900967-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:41:07.912982 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.912950 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"] Apr 24 21:41:07.916616 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:07.916585 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh"] Apr 24 21:41:08.713068 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:08.713033 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" path="/var/lib/kubelet/pods/cdf8dd72-b419-4e7d-9ee4-627089900967/volumes" Apr 24 21:41:12.595123 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:12.595095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" Apr 24 21:41:12.595652 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:12.595626 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:41:16.472773 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:16.472744 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" Apr 24 21:41:22.596549 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:22.596507 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: 
connect: connection refused" Apr 24 21:41:22.718666 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:22.718635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:41:22.725619 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:22.725598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:41:32.596405 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:32.596355 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:41:42.596193 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:42.596112 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:41:52.596876 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:41:52.596840 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" Apr 24 21:46:22.740081 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:46:22.740038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:46:22.748718 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:46:22.748673 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:49:43.078359 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.078323 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"] Apr 24 21:49:43.078895 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.078664 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container" containerID="cri-o://b5705772974d6c52eb2e04d2c32f05e26fda5cf8bdacf564df07cdbc7d9cd661" gracePeriod=30 Apr 24 21:49:43.078895 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.078736 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kube-rbac-proxy" containerID="cri-o://43675a7bb270bcdb6c3d7b0062cf6a5beb53de1aa9d2b4afc4e62709a4cd28cd" gracePeriod=30 Apr 24 21:49:43.181156 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181116 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"] Apr 24 21:49:43.181485 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181472 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" Apr 24 21:49:43.181543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181489 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" Apr 24 21:49:43.181543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181512 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kube-rbac-proxy" Apr 24 21:49:43.181543 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181530 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kube-rbac-proxy" Apr 24 21:49:43.181708 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181578 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kube-rbac-proxy" Apr 24 21:49:43.181708 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.181594 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf8dd72-b419-4e7d-9ee4-627089900967" containerName="kserve-container" Apr 24 21:49:43.184867 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.184842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.187452 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.187427 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4efea-kube-rbac-proxy-sar-config\"" Apr 24 21:49:43.187595 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.187425 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4efea-predictor-serving-cert\"" Apr 24 21:49:43.196085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.196055 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"] Apr 24 21:49:43.280789 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.280749 2578 generic.go:358] "Generic (PLEG): container finished" podID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerID="43675a7bb270bcdb6c3d7b0062cf6a5beb53de1aa9d2b4afc4e62709a4cd28cd" exitCode=2 Apr 24 21:49:43.280964 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.280812 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerDied","Data":"43675a7bb270bcdb6c3d7b0062cf6a5beb53de1aa9d2b4afc4e62709a4cd28cd"} Apr 24 21:49:43.306779 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.306738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxnh\" (UniqueName: \"kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.306943 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.306861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.306943 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.306891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.407614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.407507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxnh\" (UniqueName: 
\"kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.407614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.407577 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.407614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.407607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.408230 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.408197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.410084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.410062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls\") pod 
\"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.415647 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.415624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxnh\" (UniqueName: \"kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh\") pod \"error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.498463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.498421 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:43.623372 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.623319 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"] Apr 24 21:49:43.626176 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:49:43.626147 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod813e321c_1d89_4ab4_a19c_7a158ef051b0.slice/crio-c0991668d582dc2a02411ac7a20c237ea481732cbef05aed7bf56f8f67f251da WatchSource:0}: Error finding container c0991668d582dc2a02411ac7a20c237ea481732cbef05aed7bf56f8f67f251da: Status 404 returned error can't find the container with id c0991668d582dc2a02411ac7a20c237ea481732cbef05aed7bf56f8f67f251da Apr 24 21:49:43.628004 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:43.627990 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:49:44.286212 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:44.286173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerStarted","Data":"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c"} Apr 24 21:49:44.286212 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:44.286210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerStarted","Data":"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60"} Apr 24 21:49:44.286670 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:44.286225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerStarted","Data":"c0991668d582dc2a02411ac7a20c237ea481732cbef05aed7bf56f8f67f251da"} Apr 24 21:49:44.286670 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:44.286312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:44.307259 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:44.307193 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podStartSLOduration=1.307168924 podStartE2EDuration="1.307168924s" podCreationTimestamp="2026-04-24 21:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:44.30519162 +0000 UTC m=+2002.156380112" watchObservedRunningTime="2026-04-24 21:49:44.307168924 +0000 UTC m=+2002.158357423" Apr 24 21:49:45.290742 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:45.290708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:49:45.292013 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:45.291982 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:49:46.295877 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.295834 2578 generic.go:358] "Generic (PLEG): container finished" podID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerID="b5705772974d6c52eb2e04d2c32f05e26fda5cf8bdacf564df07cdbc7d9cd661" exitCode=0 Apr 24 21:49:46.296300 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.295912 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerDied","Data":"b5705772974d6c52eb2e04d2c32f05e26fda5cf8bdacf564df07cdbc7d9cd661"} Apr 24 21:49:46.296347 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.296313 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:49:46.331153 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.331128 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" Apr 24 21:49:46.431146 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.431050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2flc2\" (UniqueName: \"kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2\") pod \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " Apr 24 21:49:46.431146 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.431116 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls\") pod \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " Apr 24 21:49:46.431361 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.431162 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config\") pod \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\" (UID: \"105dc369-8ab5-42ab-bf68-ea4f803de9f1\") " Apr 24 21:49:46.431601 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.431572 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-75b0d-kube-rbac-proxy-sar-config") pod "105dc369-8ab5-42ab-bf68-ea4f803de9f1" (UID: "105dc369-8ab5-42ab-bf68-ea4f803de9f1"). InnerVolumeSpecName "error-404-isvc-75b0d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:49:46.433356 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.433321 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "105dc369-8ab5-42ab-bf68-ea4f803de9f1" (UID: "105dc369-8ab5-42ab-bf68-ea4f803de9f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:49:46.433356 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.433331 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2" (OuterVolumeSpecName: "kube-api-access-2flc2") pod "105dc369-8ab5-42ab-bf68-ea4f803de9f1" (UID: "105dc369-8ab5-42ab-bf68-ea4f803de9f1"). InnerVolumeSpecName "kube-api-access-2flc2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:49:46.532769 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.532723 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2flc2\" (UniqueName: \"kubernetes.io/projected/105dc369-8ab5-42ab-bf68-ea4f803de9f1-kube-api-access-2flc2\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:49:46.532769 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.532760 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105dc369-8ab5-42ab-bf68-ea4f803de9f1-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:49:46.532769 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:46.532777 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-75b0d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105dc369-8ab5-42ab-bf68-ea4f803de9f1-error-404-isvc-75b0d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:49:47.299816 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.299782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp" event={"ID":"105dc369-8ab5-42ab-bf68-ea4f803de9f1","Type":"ContainerDied","Data":"cf961d00b2c6643f231fbdd53693c1b3e8a34250d89f77de29a88bbc55355fc9"}
Apr 24 21:49:47.300212 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.299829 2578 scope.go:117] "RemoveContainer" containerID="43675a7bb270bcdb6c3d7b0062cf6a5beb53de1aa9d2b4afc4e62709a4cd28cd"
Apr 24 21:49:47.300212 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.299793 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"
Apr 24 21:49:47.307985 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.307965 2578 scope.go:117] "RemoveContainer" containerID="b5705772974d6c52eb2e04d2c32f05e26fda5cf8bdacf564df07cdbc7d9cd661"
Apr 24 21:49:47.317172 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.317112 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"]
Apr 24 21:49:47.318474 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:47.318446 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp"]
Apr 24 21:49:48.712059 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:48.712026 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" path="/var/lib/kubelet/pods/105dc369-8ab5-42ab-bf68-ea4f803de9f1/volumes"
Apr 24 21:49:51.301205 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:51.301171 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"
Apr 24 21:49:51.301782 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:49:51.301755 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 21:50:01.302423 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:01.302377 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 21:50:11.302510 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:11.302468 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 21:50:19.098310 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.098261 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:50:19.099395 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.098959 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" containerID="cri-o://d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503" gracePeriod=30
Apr 24 21:50:19.099395 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.099131 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kube-rbac-proxy" containerID="cri-o://3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238" gracePeriod=30
Apr 24 21:50:19.146601 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.146566 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"]
Apr 24 21:50:19.147072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147057 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kube-rbac-proxy"
Apr 24 21:50:19.147113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147076 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kube-rbac-proxy"
Apr 24 21:50:19.147113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147090 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container"
Apr 24 21:50:19.147113 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147099 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container"
Apr 24 21:50:19.147206 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147167 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kube-rbac-proxy"
Apr 24 21:50:19.147206 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.147185 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="105dc369-8ab5-42ab-bf68-ea4f803de9f1" containerName="kserve-container"
Apr 24 21:50:19.150892 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.150874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.153118 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.153090 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-98aac-predictor-serving-cert\""
Apr 24 21:50:19.153283 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.153261 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-98aac-kube-rbac-proxy-sar-config\""
Apr 24 21:50:19.161374 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.161348 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"]
Apr 24 21:50:19.182563 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.182518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxdk\" (UniqueName: \"kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.182756 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.182611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.182756 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.182657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.283706 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.283646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.283908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.283738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.283908 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.283816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxdk\" (UniqueName: \"kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.284038 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:19.283927 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-98aac-predictor-serving-cert: secret "error-404-isvc-98aac-predictor-serving-cert" not found
Apr 24 21:50:19.284038 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:19.284019 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls podName:f69bd325-9fbb-4e92-bc0d-942db20130b2 nodeName:}" failed. No retries permitted until 2026-04-24 21:50:19.783994822 +0000 UTC m=+2037.635183315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls") pod "error-404-isvc-98aac-predictor-5678f7cc74-45scf" (UID: "f69bd325-9fbb-4e92-bc0d-942db20130b2") : secret "error-404-isvc-98aac-predictor-serving-cert" not found
Apr 24 21:50:19.284439 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.284412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.292517 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.292483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxdk\" (UniqueName: \"kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.402085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.401983 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerID="3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238" exitCode=2
Apr 24 21:50:19.402085 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.402018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerDied","Data":"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"}
Apr 24 21:50:19.787339 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.787301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:19.789745 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:19.789722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") pod \"error-404-isvc-98aac-predictor-5678f7cc74-45scf\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:20.062172 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.062069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:20.188463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.188424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"]
Apr 24 21:50:20.192867 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:50:20.192838 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf69bd325_9fbb_4e92_bc0d_942db20130b2.slice/crio-6ddeadbd5e9bea8b237aec0fcd12ae4ef1aaec14b88a8a450eccb8493cec4d7f WatchSource:0}: Error finding container 6ddeadbd5e9bea8b237aec0fcd12ae4ef1aaec14b88a8a450eccb8493cec4d7f: Status 404 returned error can't find the container with id 6ddeadbd5e9bea8b237aec0fcd12ae4ef1aaec14b88a8a450eccb8493cec4d7f
Apr 24 21:50:20.407889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.407855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerStarted","Data":"7dde10c4155ab543cff2c4cb4207758878f542e7472eaf25f0b4df66b26fbc12"}
Apr 24 21:50:20.407889 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.407893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerStarted","Data":"d89733330af076db8b2bc5cce5c6699b257c737355a2c94eaae94c888a67efcf"}
Apr 24 21:50:20.408126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.407904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerStarted","Data":"6ddeadbd5e9bea8b237aec0fcd12ae4ef1aaec14b88a8a450eccb8493cec4d7f"}
Apr 24 21:50:20.408126 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.407988 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:20.427566 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:20.427499 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podStartSLOduration=1.427481 podStartE2EDuration="1.427481s" podCreationTimestamp="2026-04-24 21:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:20.42625015 +0000 UTC m=+2038.277438648" watchObservedRunningTime="2026-04-24 21:50:20.427481 +0000 UTC m=+2038.278669500"
Apr 24 21:50:21.301764 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:21.301720 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 21:50:21.410740 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:21.410706 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:21.412050 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:21.412018 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 21:50:22.413777 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:22.413743 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 21:50:22.591501 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:22.591455 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused"
Apr 24 21:50:22.595623 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:22.595596 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 21:50:23.042094 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.042064 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:50:23.113603 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.113512 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr98t\" (UniqueName: \"kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t\") pod \"0b157333-a6fe-4cb3-9680-aa9b1570c066\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") "
Apr 24 21:50:23.113603 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.113591 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls\") pod \"0b157333-a6fe-4cb3-9680-aa9b1570c066\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") "
Apr 24 21:50:23.113859 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.113639 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config\") pod \"0b157333-a6fe-4cb3-9680-aa9b1570c066\" (UID: \"0b157333-a6fe-4cb3-9680-aa9b1570c066\") "
Apr 24 21:50:23.114043 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.114015 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-64806-kube-rbac-proxy-sar-config") pod "0b157333-a6fe-4cb3-9680-aa9b1570c066" (UID: "0b157333-a6fe-4cb3-9680-aa9b1570c066"). InnerVolumeSpecName "error-404-isvc-64806-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:50:23.115614 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.115592 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t" (OuterVolumeSpecName: "kube-api-access-gr98t") pod "0b157333-a6fe-4cb3-9680-aa9b1570c066" (UID: "0b157333-a6fe-4cb3-9680-aa9b1570c066"). InnerVolumeSpecName "kube-api-access-gr98t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:50:23.115691 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.115634 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b157333-a6fe-4cb3-9680-aa9b1570c066" (UID: "0b157333-a6fe-4cb3-9680-aa9b1570c066"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:50:23.214819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.214777 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gr98t\" (UniqueName: \"kubernetes.io/projected/0b157333-a6fe-4cb3-9680-aa9b1570c066-kube-api-access-gr98t\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:50:23.214819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.214813 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b157333-a6fe-4cb3-9680-aa9b1570c066-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:50:23.214819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.214825 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-64806-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b157333-a6fe-4cb3-9680-aa9b1570c066-error-404-isvc-64806-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\""
Apr 24 21:50:23.418032 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.417948 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerID="d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503" exitCode=0
Apr 24 21:50:23.418032 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.418018 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"
Apr 24 21:50:23.418463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.418034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerDied","Data":"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"}
Apr 24 21:50:23.418463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.418072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z" event={"ID":"0b157333-a6fe-4cb3-9680-aa9b1570c066","Type":"ContainerDied","Data":"2f4e35033c4cd89d07735afefb27e63122f0b588e2844ec59c3014270dfdc019"}
Apr 24 21:50:23.418463 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.418090 2578 scope.go:117] "RemoveContainer" containerID="3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"
Apr 24 21:50:23.426733 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.426706 2578 scope.go:117] "RemoveContainer" containerID="d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"
Apr 24 21:50:23.436072 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.436000 2578 scope.go:117] "RemoveContainer" containerID="3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"
Apr 24 21:50:23.436728 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:23.436699 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238\": container with ID starting with 3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238 not found: ID does not exist" containerID="3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"
Apr 24 21:50:23.436820 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.436728 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238"} err="failed to get container status \"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238\": rpc error: code = NotFound desc = could not find container \"3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238\": container with ID starting with 3b59b6a2a3c7ec3d5c4c142b0a36980124ca3d45424565c5e2a359de4201f238 not found: ID does not exist"
Apr 24 21:50:23.436820 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.436747 2578 scope.go:117] "RemoveContainer" containerID="d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"
Apr 24 21:50:23.437048 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:23.437026 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503\": container with ID starting with d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503 not found: ID does not exist" containerID="d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"
Apr 24 21:50:23.437195 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.437051 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503"} err="failed to get container status \"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503\": rpc error: code = NotFound desc = could not find container \"d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503\": container with ID starting with d889e37cd8aa44f38e0dbf2151b6f3b88401f8987d3c580a1c369e47d5f4e503 not found: ID does not exist"
Apr 24 21:50:23.438945 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.438924 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:50:23.444880 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:23.444860 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z"]
Apr 24 21:50:24.712368 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:24.712331 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" path="/var/lib/kubelet/pods/0b157333-a6fe-4cb3-9680-aa9b1570c066/volumes"
Apr 24 21:50:27.418994 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:27.418968 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"
Apr 24 21:50:27.419631 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:27.419604 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 21:50:31.302328 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:31.302296 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"
Apr 24 21:50:37.419764 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:37.419713 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 21:50:47.419622 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:47.419536 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 21:50:53.455116 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.455071 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"]
Apr 24 21:50:53.455654 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.455437 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" containerID="cri-o://863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60" gracePeriod=30
Apr 24 21:50:53.455654 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.455575 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kube-rbac-proxy" containerID="cri-o://b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c" gracePeriod=30
Apr 24 21:50:53.464695 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.464639 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"]
Apr 24 21:50:53.465060 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465041 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kube-rbac-proxy"
Apr 24 21:50:53.465060 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465060 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kube-rbac-proxy"
Apr 24 21:50:53.465190 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465081 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container"
Apr 24 21:50:53.465190 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465087 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container"
Apr 24 21:50:53.465190 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465141 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kserve-container"
Apr 24 21:50:53.465190 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.465154 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b157333-a6fe-4cb3-9680-aa9b1570c066" containerName="kube-rbac-proxy"
Apr 24 21:50:53.469848 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.469825 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.472084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.472063 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-12404-kube-rbac-proxy-sar-config\""
Apr 24 21:50:53.472084 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.472073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-12404-predictor-serving-cert\""
Apr 24 21:50:53.476648 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.476620 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"]
Apr 24 21:50:53.567933 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.567897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.567933 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.567938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s2j\" (UniqueName: \"kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.568132 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.568047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.668819 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.668787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.669007 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.668840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.669007 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.668868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s2j\" (UniqueName: \"kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.669464 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.669439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.671404 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.671375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.676567 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.676537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s2j\" (UniqueName: \"kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j\") pod \"error-404-isvc-12404-predictor-6cb5f67968-778ps\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.782233 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.782182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"
Apr 24 21:50:53.907411 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:53.907310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"]
Apr 24 21:50:53.909668 ip-10-0-132-81 kubenswrapper[2578]: W0424 21:50:53.909639 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod156abc99_7927_479f_8648_7631ecb91b4b.slice/crio-8739ee497cf53e1d94ecc925e7945b1c276d2f0d6b731dec0a408234347baaae WatchSource:0}: Error finding container 8739ee497cf53e1d94ecc925e7945b1c276d2f0d6b731dec0a408234347baaae: Status 404 returned error can't find the container with id 8739ee497cf53e1d94ecc925e7945b1c276d2f0d6b731dec0a408234347baaae
Apr 24 21:50:54.525593 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.525548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerStarted","Data":"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc"}
Apr 24 21:50:54.526061 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.525600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerStarted","Data":"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a"}
Apr 24 21:50:54.526061 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.525614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerStarted","Data":"8739ee497cf53e1d94ecc925e7945b1c276d2f0d6b731dec0a408234347baaae"}
Apr 24 21:50:54.526061 ip-10-0-132-81
kubenswrapper[2578]: I0424 21:50:54.525725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 21:50:54.527268 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.527244 2578 generic.go:358] "Generic (PLEG): container finished" podID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerID="b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c" exitCode=2 Apr 24 21:50:54.527396 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.527302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerDied","Data":"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c"} Apr 24 21:50:54.542720 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:54.542646 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podStartSLOduration=1.542630642 podStartE2EDuration="1.542630642s" podCreationTimestamp="2026-04-24 21:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:54.541316209 +0000 UTC m=+2072.392504707" watchObservedRunningTime="2026-04-24 21:50:54.542630642 +0000 UTC m=+2072.393819140" Apr 24 21:50:55.531191 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:55.531160 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 21:50:55.532582 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:55.532549 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:50:56.297141 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:56.297095 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 24 21:50:56.534005 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:56.533966 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:50:57.094478 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.094454 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:50:57.199991 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.199899 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jxnh\" (UniqueName: \"kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh\") pod \"813e321c-1d89-4ab4-a19c-7a158ef051b0\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " Apr 24 21:50:57.199991 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.199981 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config\") pod \"813e321c-1d89-4ab4-a19c-7a158ef051b0\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " Apr 24 21:50:57.200226 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.200098 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls\") pod \"813e321c-1d89-4ab4-a19c-7a158ef051b0\" (UID: \"813e321c-1d89-4ab4-a19c-7a158ef051b0\") " Apr 24 21:50:57.200385 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.200359 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-4efea-kube-rbac-proxy-sar-config") pod "813e321c-1d89-4ab4-a19c-7a158ef051b0" (UID: "813e321c-1d89-4ab4-a19c-7a158ef051b0"). InnerVolumeSpecName "error-404-isvc-4efea-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:57.202166 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.202137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "813e321c-1d89-4ab4-a19c-7a158ef051b0" (UID: "813e321c-1d89-4ab4-a19c-7a158ef051b0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:57.202607 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.202583 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh" (OuterVolumeSpecName: "kube-api-access-5jxnh") pod "813e321c-1d89-4ab4-a19c-7a158ef051b0" (UID: "813e321c-1d89-4ab4-a19c-7a158ef051b0"). InnerVolumeSpecName "kube-api-access-5jxnh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:57.301789 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.301745 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jxnh\" (UniqueName: \"kubernetes.io/projected/813e321c-1d89-4ab4-a19c-7a158ef051b0-kube-api-access-5jxnh\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.301789 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.301784 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-4efea-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/813e321c-1d89-4ab4-a19c-7a158ef051b0-error-404-isvc-4efea-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.301996 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.301800 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813e321c-1d89-4ab4-a19c-7a158ef051b0-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 21:50:57.419578 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.419536 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:50:57.538924 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.538885 2578 generic.go:358] "Generic (PLEG): container finished" podID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerID="863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60" exitCode=0 Apr 24 21:50:57.539399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.538958 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" Apr 24 21:50:57.539399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.538959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerDied","Data":"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60"} Apr 24 21:50:57.539399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.539078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8" event={"ID":"813e321c-1d89-4ab4-a19c-7a158ef051b0","Type":"ContainerDied","Data":"c0991668d582dc2a02411ac7a20c237ea481732cbef05aed7bf56f8f67f251da"} Apr 24 21:50:57.539399 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.539102 2578 scope.go:117] "RemoveContainer" containerID="b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c" Apr 24 21:50:57.547893 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.547872 2578 scope.go:117] "RemoveContainer" containerID="863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60" Apr 24 21:50:57.557719 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.557696 2578 scope.go:117] "RemoveContainer" containerID="b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c" Apr 24 21:50:57.558106 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:57.558080 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c\": container with ID starting with b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c not found: ID does not exist" containerID="b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c" Apr 24 21:50:57.558284 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.558257 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c"} err="failed to get container status \"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c\": rpc error: code = NotFound desc = could not find container \"b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c\": container with ID starting with b142b4d17e3f1c7fa6c01ef551e87d6f59e92a2feb876fd355fa0863f5ecc33c not found: ID does not exist" Apr 24 21:50:57.558400 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.558388 2578 scope.go:117] "RemoveContainer" containerID="863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60" Apr 24 21:50:57.561710 ip-10-0-132-81 kubenswrapper[2578]: E0424 21:50:57.561140 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60\": container with ID starting with 863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60 not found: ID does not exist" containerID="863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60" Apr 24 21:50:57.561710 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.561216 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60"} err="failed to get container status \"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60\": rpc error: code = NotFound desc = could not find container \"863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60\": container with ID starting with 863ac3aa45eb751adcf5bbcf4c9cd8041e39633fb4fb1f4ffe603231ae7ada60 not found: ID does not exist" Apr 24 21:50:57.567186 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.567139 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"] Apr 24 21:50:57.567342 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:57.567195 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8"] Apr 24 21:50:58.712859 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:50:58.712823 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" path="/var/lib/kubelet/pods/813e321c-1d89-4ab4-a19c-7a158ef051b0/volumes" Apr 24 21:51:01.539395 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:01.539363 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 21:51:01.539942 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:01.539912 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:51:07.420737 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:07.420708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" Apr 24 21:51:11.540482 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:11.540438 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:51:21.540132 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:21.540094 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" 
podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:51:22.759962 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:22.759930 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:51:22.771253 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:22.771230 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:51:31.540449 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:31.540402 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:51:41.540860 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:51:41.540827 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 21:56:22.781502 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:56:22.781474 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 21:56:22.792776 ip-10-0-132-81 kubenswrapper[2578]: I0424 21:56:22.792754 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 22:00:08.264430 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:08.264340 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"] Apr 24 22:00:08.268133 
ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:08.268063 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" containerID="cri-o://3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a" gracePeriod=30 Apr 24 22:00:08.268293 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:08.268118 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kube-rbac-proxy" containerID="cri-o://ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc" gracePeriod=30 Apr 24 22:00:09.359360 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:09.359326 2578 generic.go:358] "Generic (PLEG): container finished" podID="156abc99-7927-479f-8648-7631ecb91b4b" containerID="ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc" exitCode=2 Apr 24 22:00:09.359784 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:09.359406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerDied","Data":"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc"} Apr 24 22:00:11.515860 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.515838 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 22:00:11.562744 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.562656 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s2j\" (UniqueName: \"kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j\") pod \"156abc99-7927-479f-8648-7631ecb91b4b\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " Apr 24 22:00:11.562744 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.562712 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls\") pod \"156abc99-7927-479f-8648-7631ecb91b4b\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " Apr 24 22:00:11.562917 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.562770 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config\") pod \"156abc99-7927-479f-8648-7631ecb91b4b\" (UID: \"156abc99-7927-479f-8648-7631ecb91b4b\") " Apr 24 22:00:11.563218 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.563167 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-12404-kube-rbac-proxy-sar-config") pod "156abc99-7927-479f-8648-7631ecb91b4b" (UID: "156abc99-7927-479f-8648-7631ecb91b4b"). InnerVolumeSpecName "error-404-isvc-12404-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:00:11.565006 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.564972 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "156abc99-7927-479f-8648-7631ecb91b4b" (UID: "156abc99-7927-479f-8648-7631ecb91b4b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:11.565276 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.565242 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j" (OuterVolumeSpecName: "kube-api-access-h8s2j") pod "156abc99-7927-479f-8648-7631ecb91b4b" (UID: "156abc99-7927-479f-8648-7631ecb91b4b"). InnerVolumeSpecName "kube-api-access-h8s2j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:11.663524 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.663484 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-12404-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/156abc99-7927-479f-8648-7631ecb91b4b-error-404-isvc-12404-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:00:11.663524 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.663519 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8s2j\" (UniqueName: \"kubernetes.io/projected/156abc99-7927-479f-8648-7631ecb91b4b-kube-api-access-h8s2j\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:00:11.663524 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:11.663529 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/156abc99-7927-479f-8648-7631ecb91b4b-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:00:12.371073 
ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.371038 2578 generic.go:358] "Generic (PLEG): container finished" podID="156abc99-7927-479f-8648-7631ecb91b4b" containerID="3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a" exitCode=0 Apr 24 22:00:12.371277 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.371091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerDied","Data":"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a"} Apr 24 22:00:12.371277 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.371124 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" Apr 24 22:00:12.371277 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.371139 2578 scope.go:117] "RemoveContainer" containerID="ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc" Apr 24 22:00:12.371277 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.371125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps" event={"ID":"156abc99-7927-479f-8648-7631ecb91b4b","Type":"ContainerDied","Data":"8739ee497cf53e1d94ecc925e7945b1c276d2f0d6b731dec0a408234347baaae"} Apr 24 22:00:12.379734 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.379712 2578 scope.go:117] "RemoveContainer" containerID="3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a" Apr 24 22:00:12.387100 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.386949 2578 scope.go:117] "RemoveContainer" containerID="ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc" Apr 24 22:00:12.387224 ip-10-0-132-81 kubenswrapper[2578]: E0424 22:00:12.387204 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc\": container with ID starting with ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc not found: ID does not exist" containerID="ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc" Apr 24 22:00:12.387295 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.387237 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc"} err="failed to get container status \"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc\": rpc error: code = NotFound desc = could not find container \"ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc\": container with ID starting with ed39b9238cb61048df612a4a948f7461b64bd2ba8a7dea3ed1b61027fc819cbc not found: ID does not exist" Apr 24 22:00:12.387295 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.387263 2578 scope.go:117] "RemoveContainer" containerID="3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a" Apr 24 22:00:12.387526 ip-10-0-132-81 kubenswrapper[2578]: E0424 22:00:12.387507 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a\": container with ID starting with 3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a not found: ID does not exist" containerID="3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a" Apr 24 22:00:12.387561 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.387532 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a"} err="failed to get container status \"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a\": rpc error: code = NotFound desc = could not find container 
\"3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a\": container with ID starting with 3fed1361150babbabfc0e1fee3a979c0704b2e5583a93f7446ccf2545528ce8a not found: ID does not exist" Apr 24 22:00:12.396275 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.396249 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"] Apr 24 22:00:12.404788 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.404763 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps"] Apr 24 22:00:12.712355 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:00:12.712273 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156abc99-7927-479f-8648-7631ecb91b4b" path="/var/lib/kubelet/pods/156abc99-7927-479f-8648-7631ecb91b4b/volumes" Apr 24 22:01:22.802468 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:01:22.802357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 22:01:22.813748 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:01:22.813724 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 22:06:22.823962 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:06:22.823850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 22:06:22.834761 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:06:22.834735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log" Apr 24 22:07:38.531368 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:38.531326 2578 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"] Apr 24 22:07:38.531938 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:38.531634 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" containerID="cri-o://d89733330af076db8b2bc5cce5c6699b257c737355a2c94eaae94c888a67efcf" gracePeriod=30 Apr 24 22:07:38.531938 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:38.531667 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kube-rbac-proxy" containerID="cri-o://7dde10c4155ab543cff2c4cb4207758878f542e7472eaf25f0b4df66b26fbc12" gracePeriod=30 Apr 24 22:07:38.806086 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:38.805987 2578 generic.go:358] "Generic (PLEG): container finished" podID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerID="7dde10c4155ab543cff2c4cb4207758878f542e7472eaf25f0b4df66b26fbc12" exitCode=2 Apr 24 22:07:38.806086 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:38.806049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerDied","Data":"7dde10c4155ab543cff2c4cb4207758878f542e7472eaf25f0b4df66b26fbc12"} Apr 24 22:07:41.818204 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:41.818171 2578 generic.go:358] "Generic (PLEG): container finished" podID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerID="d89733330af076db8b2bc5cce5c6699b257c737355a2c94eaae94c888a67efcf" exitCode=0 Apr 24 22:07:41.818547 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:41.818238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerDied","Data":"d89733330af076db8b2bc5cce5c6699b257c737355a2c94eaae94c888a67efcf"} Apr 24 22:07:41.873895 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:41.873863 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" Apr 24 22:07:42.014292 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.014258 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fxdk\" (UniqueName: \"kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk\") pod \"f69bd325-9fbb-4e92-bc0d-942db20130b2\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " Apr 24 22:07:42.014483 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.014347 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") pod \"f69bd325-9fbb-4e92-bc0d-942db20130b2\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " Apr 24 22:07:42.014483 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.014384 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config\") pod \"f69bd325-9fbb-4e92-bc0d-942db20130b2\" (UID: \"f69bd325-9fbb-4e92-bc0d-942db20130b2\") " Apr 24 22:07:42.014886 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.014848 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-98aac-kube-rbac-proxy-sar-config") pod 
"f69bd325-9fbb-4e92-bc0d-942db20130b2" (UID: "f69bd325-9fbb-4e92-bc0d-942db20130b2"). InnerVolumeSpecName "error-404-isvc-98aac-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:07:42.016351 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.016329 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f69bd325-9fbb-4e92-bc0d-942db20130b2" (UID: "f69bd325-9fbb-4e92-bc0d-942db20130b2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:07:42.016351 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.016342 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk" (OuterVolumeSpecName: "kube-api-access-4fxdk") pod "f69bd325-9fbb-4e92-bc0d-942db20130b2" (UID: "f69bd325-9fbb-4e92-bc0d-942db20130b2"). InnerVolumeSpecName "kube-api-access-4fxdk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:07:42.115839 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.115798 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4fxdk\" (UniqueName: \"kubernetes.io/projected/f69bd325-9fbb-4e92-bc0d-942db20130b2-kube-api-access-4fxdk\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:07:42.115839 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.115833 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f69bd325-9fbb-4e92-bc0d-942db20130b2-proxy-tls\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:07:42.115839 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.115846 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-98aac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f69bd325-9fbb-4e92-bc0d-942db20130b2-error-404-isvc-98aac-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-81.ec2.internal\" DevicePath \"\"" Apr 24 22:07:42.823320 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.823288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" event={"ID":"f69bd325-9fbb-4e92-bc0d-942db20130b2","Type":"ContainerDied","Data":"6ddeadbd5e9bea8b237aec0fcd12ae4ef1aaec14b88a8a450eccb8493cec4d7f"} Apr 24 22:07:42.823744 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.823334 2578 scope.go:117] "RemoveContainer" containerID="7dde10c4155ab543cff2c4cb4207758878f542e7472eaf25f0b4df66b26fbc12" Apr 24 22:07:42.823744 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.823297 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf" Apr 24 22:07:42.835916 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.835885 2578 scope.go:117] "RemoveContainer" containerID="d89733330af076db8b2bc5cce5c6699b257c737355a2c94eaae94c888a67efcf" Apr 24 22:07:42.841422 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.841396 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"] Apr 24 22:07:42.845275 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:42.845253 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf"] Apr 24 22:07:44.714492 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:07:44.714456 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" path="/var/lib/kubelet/pods/f69bd325-9fbb-4e92-bc0d-942db20130b2/volumes" Apr 24 22:08:05.808397 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808352 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bccfr/must-gather-rf887"] Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808808 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808828 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808839 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808847 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808862 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kube-rbac-proxy" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808870 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kube-rbac-proxy" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808879 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" Apr 24 22:08:05.808899 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808886 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808914 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808922 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808936 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.808944 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809016 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809028 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="813e321c-1d89-4ab4-a19c-7a158ef051b0" containerName="kserve-container" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809040 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809052 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f69bd325-9fbb-4e92-bc0d-942db20130b2" containerName="kserve-container" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809066 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kube-rbac-proxy" Apr 24 22:08:05.809278 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.809076 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="156abc99-7927-479f-8648-7631ecb91b4b" containerName="kserve-container" Apr 24 22:08:05.812035 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.812010 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.814283 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.814258 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bccfr\"/\"openshift-service-ca.crt\"" Apr 24 22:08:05.814399 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.814266 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bccfr\"/\"kube-root-ca.crt\"" Apr 24 22:08:05.814462 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.814435 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bccfr\"/\"default-dockercfg-vcvfc\"" Apr 24 22:08:05.820625 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.820598 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/must-gather-rf887"] Apr 24 22:08:05.886596 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.886559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmsl\" (UniqueName: \"kubernetes.io/projected/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-kube-api-access-8bmsl\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.886596 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.886614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-must-gather-output\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.987421 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.987375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmsl\" (UniqueName: 
\"kubernetes.io/projected/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-kube-api-access-8bmsl\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.987421 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.987421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-must-gather-output\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.987733 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.987717 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-must-gather-output\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:05.995777 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:05.995747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmsl\" (UniqueName: \"kubernetes.io/projected/71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6-kube-api-access-8bmsl\") pod \"must-gather-rf887\" (UID: \"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6\") " pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:06.122470 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:06.122376 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bccfr/must-gather-rf887" Apr 24 22:08:06.242144 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:06.242122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/must-gather-rf887"] Apr 24 22:08:06.244741 ip-10-0-132-81 kubenswrapper[2578]: W0424 22:08:06.244711 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ecf9cb_ac9d_4fe2_8c11_1d2ac978b2b6.slice/crio-1cd80f21b980ca81c3b1564810e75981811e0c6a058a5df81b00872d369e2547 WatchSource:0}: Error finding container 1cd80f21b980ca81c3b1564810e75981811e0c6a058a5df81b00872d369e2547: Status 404 returned error can't find the container with id 1cd80f21b980ca81c3b1564810e75981811e0c6a058a5df81b00872d369e2547 Apr 24 22:08:06.246500 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:06.246482 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:08:06.908128 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:06.908091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/must-gather-rf887" event={"ID":"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6","Type":"ContainerStarted","Data":"1cd80f21b980ca81c3b1564810e75981811e0c6a058a5df81b00872d369e2547"} Apr 24 22:08:07.914482 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:07.914443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/must-gather-rf887" event={"ID":"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6","Type":"ContainerStarted","Data":"d844213202d361b48b2f0e5a3cdad29ec5013515fb89aa5423a16dcc3069303e"} Apr 24 22:08:07.914482 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:07.914488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/must-gather-rf887" 
event={"ID":"71ecf9cb-ac9d-4fe2-8c11-1d2ac978b2b6","Type":"ContainerStarted","Data":"803cf4a1af0329e9834dc7aafc10b18507227c96601c34e58a5a53a4622e65e8"} Apr 24 22:08:07.935564 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:07.935506 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bccfr/must-gather-rf887" podStartSLOduration=2.136349957 podStartE2EDuration="2.935491139s" podCreationTimestamp="2026-04-24 22:08:05 +0000 UTC" firstStartedPulling="2026-04-24 22:08:06.246616309 +0000 UTC m=+3104.097804790" lastFinishedPulling="2026-04-24 22:08:07.045757492 +0000 UTC m=+3104.896945972" observedRunningTime="2026-04-24 22:08:07.933275612 +0000 UTC m=+3105.784464111" watchObservedRunningTime="2026-04-24 22:08:07.935491139 +0000 UTC m=+3105.786679638" Apr 24 22:08:08.491359 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:08.491327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6lx29_dcc464bb-ad40-4751-9ee3-8ee223123ed6/global-pull-secret-syncer/0.log" Apr 24 22:08:08.587992 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:08.587942 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2c2cp_c3b3582a-d64c-4ef1-8758-602aabd2be60/konnectivity-agent/0.log" Apr 24 22:08:08.701472 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:08.701438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-81.ec2.internal_91804ee55f307105f50a9b09138ac4ec/haproxy/0.log" Apr 24 22:08:12.279878 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.279845 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-n677g_d5a35927-4abc-4212-9393-0fc58f4776db/monitoring-plugin/0.log" Apr 24 22:08:12.476645 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.476613 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-n2flz_9f3aec88-d654-473c-a26a-236aeb20a6cd/node-exporter/0.log" Apr 24 22:08:12.502294 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.502262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n2flz_9f3aec88-d654-473c-a26a-236aeb20a6cd/kube-rbac-proxy/0.log" Apr 24 22:08:12.525352 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.525310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n2flz_9f3aec88-d654-473c-a26a-236aeb20a6cd/init-textfile/0.log" Apr 24 22:08:12.949961 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.949926 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/thanos-query/0.log" Apr 24 22:08:12.974494 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:12.974470 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/kube-rbac-proxy-web/0.log" Apr 24 22:08:13.001535 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:13.001505 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/kube-rbac-proxy/0.log" Apr 24 22:08:13.022606 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:13.022579 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/prom-label-proxy/0.log" Apr 24 22:08:13.044695 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:13.044642 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/kube-rbac-proxy-rules/0.log" Apr 24 22:08:13.070597 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:13.070569 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b985d598b-262rf_3418a28a-7b63-4978-949c-05cef2f4db92/kube-rbac-proxy-metrics/0.log" Apr 24 22:08:15.004353 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.004301 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57d8cfbd8d-qxglm_fb46568a-d191-418a-8d51-03b26e627c83/console/0.log" Apr 24 22:08:15.043763 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.043728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-m27jw_f9ced7e0-e303-4e2f-9642-e2c46ac4800a/download-server/0.log" Apr 24 22:08:15.715549 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.715513 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr"] Apr 24 22:08:15.723536 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.720789 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.728083 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.728047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr"] Apr 24 22:08:15.769153 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.769112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-sys\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.769319 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.769168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvk8v\" (UniqueName: \"kubernetes.io/projected/16f4c503-0371-4830-88b4-bcd8c4d1373b-kube-api-access-nvk8v\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.769319 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.769224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-proc\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.769319 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.769294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-podres\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " 
pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.769421 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.769390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-lib-modules\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870130 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-podres\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870130 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-lib-modules\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-sys\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvk8v\" (UniqueName: 
\"kubernetes.io/projected/16f4c503-0371-4830-88b4-bcd8c4d1373b-kube-api-access-nvk8v\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-proc\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-proc\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-podres\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-sys\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.870383 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.870358 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16f4c503-0371-4830-88b4-bcd8c4d1373b-lib-modules\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:15.877728 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:15.877704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvk8v\" (UniqueName: \"kubernetes.io/projected/16f4c503-0371-4830-88b4-bcd8c4d1373b-kube-api-access-nvk8v\") pod \"perf-node-gather-daemonset-wgtcr\" (UID: \"16f4c503-0371-4830-88b4-bcd8c4d1373b\") " pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:16.035273 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.035237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" Apr 24 22:08:16.165338 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.165310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr"] Apr 24 22:08:16.165770 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.165742 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tsgdk_1b1a23d3-3a62-4a84-a3aa-d49e382f7322/dns/0.log" Apr 24 22:08:16.167780 ip-10-0-132-81 kubenswrapper[2578]: W0424 22:08:16.167752 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod16f4c503_0371_4830_88b4_bcd8c4d1373b.slice/crio-da7b262bebbfc5de25b6fcfe8e447fbfb45b3dcc9479e8afdaa1b31c5e966ec9 WatchSource:0}: Error finding container da7b262bebbfc5de25b6fcfe8e447fbfb45b3dcc9479e8afdaa1b31c5e966ec9: Status 404 returned error can't find the container with id da7b262bebbfc5de25b6fcfe8e447fbfb45b3dcc9479e8afdaa1b31c5e966ec9 Apr 24 22:08:16.185834 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.185807 
2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tsgdk_1b1a23d3-3a62-4a84-a3aa-d49e382f7322/kube-rbac-proxy/0.log"
Apr 24 22:08:16.210236 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.210212 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4x96t_b90d25e6-8bbe-484f-9222-fe772ed03d48/dns-node-resolver/0.log"
Apr 24 22:08:16.693490 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.693402 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2tx46_a2a58d1a-b069-41cc-b869-2a502d2e4e3c/node-ca/0.log"
Apr 24 22:08:16.954128 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.954026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" event={"ID":"16f4c503-0371-4830-88b4-bcd8c4d1373b","Type":"ContainerStarted","Data":"a2078a0bc2d583fdb957131e1d0d52476692a953a2e2d7fafe00dcb082bbc7bd"}
Apr 24 22:08:16.954128 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.954078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" event={"ID":"16f4c503-0371-4830-88b4-bcd8c4d1373b","Type":"ContainerStarted","Data":"da7b262bebbfc5de25b6fcfe8e447fbfb45b3dcc9479e8afdaa1b31c5e966ec9"}
Apr 24 22:08:16.954420 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.954393 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr"
Apr 24 22:08:16.970572 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:16.970500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr" podStartSLOduration=1.970477383 podStartE2EDuration="1.970477383s" podCreationTimestamp="2026-04-24 22:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:08:16.968580242 +0000 UTC m=+3114.819768743" watchObservedRunningTime="2026-04-24 22:08:16.970477383 +0000 UTC m=+3114.821665885"
Apr 24 22:08:17.780133 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:17.780098 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-srdps_e0d12939-1428-44ac-bf2e-c0fee7bf5161/serve-healthcheck-canary/0.log"
Apr 24 22:08:18.212833 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:18.212753 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4fz4_8590998a-f1f9-44c1-b54a-f83ca722daad/kube-rbac-proxy/0.log"
Apr 24 22:08:18.232758 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:18.232731 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4fz4_8590998a-f1f9-44c1-b54a-f83ca722daad/exporter/0.log"
Apr 24 22:08:18.253250 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:18.253222 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4fz4_8590998a-f1f9-44c1-b54a-f83ca722daad/extractor/0.log"
Apr 24 22:08:20.218009 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:20.217972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-gzdl2_31a0d55e-bdad-4e40-904b-9a677606a391/manager/0.log"
Apr 24 22:08:20.503151 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:20.503123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-x9gcs_3e769146-edb3-4f7e-912a-8ecc88bd044d/s3-init/0.log"
Apr 24 22:08:20.530992 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:20.530957 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dnd4k_f01308fc-a4d6-482c-abcd-2c45cba9f156/seaweedfs/0.log"
Apr 24 22:08:22.967956 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:22.967926 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-bccfr/perf-node-gather-daemonset-wgtcr"
Apr 24 22:08:24.122263 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:24.122199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cl5g5_165cdc84-5c62-46ec-9b79-042587eb1d6d/migrator/0.log"
Apr 24 22:08:24.145569 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:24.145538 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cl5g5_165cdc84-5c62-46ec-9b79-042587eb1d6d/graceful-termination/0.log"
Apr 24 22:08:25.788634 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.788607 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:08:25.833172 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.833096 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/egress-router-binary-copy/0.log"
Apr 24 22:08:25.869093 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.869055 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/cni-plugins/0.log"
Apr 24 22:08:25.912369 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.912336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/bond-cni-plugin/0.log"
Apr 24 22:08:25.937205 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.937174 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/routeoverride-cni/0.log"
Apr 24 22:08:25.980170 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:25.980145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/whereabouts-cni-bincopy/0.log"
Apr 24 22:08:26.051115 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:26.051082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjzlc_f8fda77c-3dd1-4cdf-998b-c5cf0dac12b1/whereabouts-cni/0.log"
Apr 24 22:08:26.380446 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:26.380414 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-klqg5_b5070b1b-b70e-45ae-b891-b47bcfb3f22a/kube-multus/0.log"
Apr 24 22:08:26.460258 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:26.460219 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62n84_214d9db8-1af2-4a55-8c32-7b0ade9b8b1b/network-metrics-daemon/0.log"
Apr 24 22:08:26.490670 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:26.490635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62n84_214d9db8-1af2-4a55-8c32-7b0ade9b8b1b/kube-rbac-proxy/0.log"
Apr 24 22:08:27.686217 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.686181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-controller/0.log"
Apr 24 22:08:27.703966 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.703934 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/0.log"
Apr 24 22:08:27.731991 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.731961 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovn-acl-logging/1.log"
Apr 24 22:08:27.755479 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.755444 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/kube-rbac-proxy-node/0.log"
Apr 24 22:08:27.775538 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.775503 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:08:27.793911 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.793876 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/northd/0.log"
Apr 24 22:08:27.812008 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.811970 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/nbdb/0.log"
Apr 24 22:08:27.832934 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:27.832907 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/sbdb/0.log"
Apr 24 22:08:28.032201 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:28.032103 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cg9z2_38d01fc4-4ff2-408e-baa1-6d9c62d27470/ovnkube-controller/0.log"
Apr 24 22:08:29.337427 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:29.337394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dwmtg_87f55c8d-8e86-4982-94b5-0f6145c23361/network-check-target-container/0.log"
Apr 24 22:08:30.286732 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:30.286697 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s2wzz_8d2fcb39-1f56-4917-9964-9a549ea0b2a2/iptables-alerter/0.log"
Apr 24 22:08:30.963868 ip-10-0-132-81 kubenswrapper[2578]: I0424 22:08:30.963818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6xzp7_372d4cd0-c127-43df-b1c7-06d67c0f967c/tuned/0.log"