Apr 16 08:39:07.946989 ip-10-0-139-8 systemd[1]: Starting Kubernetes Kubelet... Apr 16 08:39:08.433087 ip-10-0-139-8 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 08:39:08.433087 ip-10-0-139-8 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 08:39:08.433087 ip-10-0-139-8 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 08:39:08.433703 ip-10-0-139-8 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 08:39:08.433703 ip-10-0-139-8 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 08:39:08.433703 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.433155 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 08:39:08.436753 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436736 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:39:08.436753 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436753 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436757 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436760 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436763 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436766 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436769 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436771 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436775 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436777 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436780 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436783 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:39:08.436817 ip-10-0-139-8 
kubenswrapper[2578]: W0416 08:39:08.436785 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436788 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436792 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436796 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436799 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436802 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436805 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436808 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:39:08.436817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436810 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436814 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436817 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436820 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436823 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436826 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436829 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436831 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436835 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436837 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436840 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436843 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436845 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436848 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 
08:39:08.436851 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436853 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436856 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436858 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436861 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:39:08.437259 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436865 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436869 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436871 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436874 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436877 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436880 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436882 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436885 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436887 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436890 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436892 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436895 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436897 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436900 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436902 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436906 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436909 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436912 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:39:08.437775 
ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436914 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436917 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:39:08.437775 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436919 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436922 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436924 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436927 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436929 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436932 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436934 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436937 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436939 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436942 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436944 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436947 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436951 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436954 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436957 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436959 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436962 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436964 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436967 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436969 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:39:08.438249 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436972 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:39:08.438738 
ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436974 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:39:08.438738 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436977 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:39:08.438738 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436980 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:39:08.438738 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436982 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:39:08.438738 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436985 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:39:08.438738 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.436988 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:39:08.440002 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.439991 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:39:08.440002 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440002 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440006 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440008 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440011 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440014 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440017 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440019 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440022 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440024 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440027 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440029 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440032 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440034 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440037 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440039 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440042 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440044 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440047 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440049 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440052 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:39:08.440066 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440054 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440056 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440059 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440061 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440065 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440067 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440070 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440073 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440076 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440078 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440081 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440084 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440087 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440090 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440092 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440095 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440097 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440099 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig 
Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440102 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440104 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:39:08.440549 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440107 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440109 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440112 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440114 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440116 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440119 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440121 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440124 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440126 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440129 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440131 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440134 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440136 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440138 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440142 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440144 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440149 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440152 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440155 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:39:08.441050 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440158 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440160 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440163 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440165 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440168 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440171 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440174 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440177 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440180 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440183 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440185 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440187 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440190 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440192 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440195 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440197 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440200 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440203 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440205 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440210 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 08:39:08.441519 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440213 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440216 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440219 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440221 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440224 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440227 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440295 2578 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440303 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440311 2578 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440315 2578 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440320 2578 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440323 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440328 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440332 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440336 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440339 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440342 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440345 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440349 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440352 2578 flags.go:64] FLAG: --cgroup-root="" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440355 2578 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440358 2578 flags.go:64] FLAG: --client-ca-file="" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440361 2578 flags.go:64] FLAG: --cloud-config="" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440363 2578 flags.go:64] FLAG: --cloud-provider="external" Apr 16 08:39:08.442022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440366 2578 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 
08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440371 2578 flags.go:64] FLAG: --cluster-domain="" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440374 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440377 2578 flags.go:64] FLAG: --config-dir="" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440379 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440383 2578 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440387 2578 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440390 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440393 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440396 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440399 2578 flags.go:64] FLAG: --contention-profiling="false" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440402 2578 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440405 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440408 2578 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440411 2578 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440416 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440419 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440421 2578 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440424 2578 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440427 2578 flags.go:64] FLAG: --enable-server="true" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440430 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440435 2578 flags.go:64] FLAG: --event-burst="100" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440438 2578 flags.go:64] FLAG: --event-qps="50" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440441 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440444 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 08:39:08.442595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440447 2578 flags.go:64] FLAG: --eviction-hard="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440451 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 
08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440454 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440457 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440460 2578 flags.go:64] FLAG: --eviction-soft="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440463 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440466 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440469 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440472 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440475 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440478 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440481 2578 flags.go:64] FLAG: --feature-gates="" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440493 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440498 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440501 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440504 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440508 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440511 2578 flags.go:64] FLAG: --help="false" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440514 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440518 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440521 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440524 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440527 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 08:39:08.443207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440530 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440533 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440536 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440539 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 08:39:08.443790 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:39:08.440542 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440545 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440548 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440551 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440554 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440557 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440560 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440563 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440566 2578 flags.go:64] FLAG: --lock-file="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440570 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440573 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440576 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440582 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440585 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440588 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440590 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440593 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440597 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440600 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440602 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440607 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 08:39:08.443790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440610 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440614 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440618 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440621 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440624 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440627 2578 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440630 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440633 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440636 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440644 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440648 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440650 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440654 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440656 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440663 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440666 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440669 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440671 2578 flags.go:64] FLAG: --port="10250" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440674 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440677 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0235f23bd9194fd04" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440680 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440684 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440687 2578 flags.go:64] FLAG: --register-node="true" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440690 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 08:39:08.444390 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440693 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440702 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440704 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440720 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440724 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440731 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440734 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:08.440737 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440740 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440743 2578 flags.go:64] FLAG: --runonce="false" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440746 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440749 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440752 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440755 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440758 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440761 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440764 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440767 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440770 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440772 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440775 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440778 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440781 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440784 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440786 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 08:39:08.444989 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440792 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440795 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440797 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440801 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440804 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440807 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440810 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440813 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440816 
2578 flags.go:64] FLAG: --v="2" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440820 2578 flags.go:64] FLAG: --version="false" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440824 2578 flags.go:64] FLAG: --vmodule="" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440828 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.440831 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440926 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440933 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440936 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440939 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440943 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440945 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440948 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440951 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440954 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440957 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:39:08.445606 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440959 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440962 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440964 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440967 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440969 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440972 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440974 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440977 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440979 2578 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440982 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440984 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440987 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440989 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440991 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440994 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.440998 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441001 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441003 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441005 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:39:08.446185 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441008 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441011 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441013 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441016 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441020 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441022 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441025 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441029 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441033 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441035 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441040 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441043 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441046 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441049 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441052 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441054 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441057 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441059 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441062 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441064 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:39:08.446666 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441067 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441069 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441072 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441074 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441077 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441079 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441081 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441084 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441086 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441090 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441092 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 
08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441095 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441097 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441100 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441102 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441105 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441108 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441111 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441113 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441116 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:39:08.447477 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441118 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441122 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441127 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441130 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441133 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441135 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441138 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441140 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441143 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441145 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441148 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441150 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441153 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441155 2578 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441157 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441160 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:39:08.448272 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.441162 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:39:08.448884 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.441755 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:39:08.450445 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.450423 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 08:39:08.450445 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.450446 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450518 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450526 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450532 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450538 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450543 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450547 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450552 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450556 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450560 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450565 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450569 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450573 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450577 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450581 2578 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450587 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450591 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450595 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450599 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:39:08.450596 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450603 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450607 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450612 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450615 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450619 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450623 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450627 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450631 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450635 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450639 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450643 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450648 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450652 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450655 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450659 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450663 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450667 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450671 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:39:08.451467 ip-10-0-139-8 
kubenswrapper[2578]: W0416 08:39:08.450675 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450678 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:39:08.451467 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450682 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450686 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450690 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450694 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450698 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450703 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450731 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450736 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450740 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450744 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450748 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450752 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450756 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450759 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450765 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450769 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450773 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450777 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450781 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450785 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:39:08.452103 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450789 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:39:08.452945 ip-10-0-139-8 
kubenswrapper[2578]: W0416 08:39:08.450793 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450798 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450802 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450806 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450810 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450816 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450821 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450826 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450829 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450833 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450840 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
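
After each pass over the requested gates, the kubelet records the effective upstream values in an I-level "feature gates: {map[...]}" entry (one appears just above and another just below). When grepping these journals it can help to pull that map out of the line; a small stdlib-only Go sketch, with an abbreviated sample line, is shown here as an assumption-free convenience rather than anything the kubelet itself provides.

package main

import (
	"fmt"
	"regexp"
	"strings"
)

// parseGates extracts the key:value pairs from a kubelet
// "feature gates: {map[...]}" log line into a map[string]bool.
func parseGates(line string) map[string]bool {
	gates := map[string]bool{}
	m := regexp.MustCompile(`map\[(.*)\]`).FindStringSubmatch(line)
	if m == nil {
		return gates
	}
	for _, pair := range strings.Fields(m[1]) {
		kv := strings.SplitN(pair, ":", 2)
		if len(kv) == 2 {
			gates[kv[0]] = kv[1] == "true"
		}
	}
	return gates
}

func main() {
	line := `feature gates: {map[KMSv1:true NodeSwap:false UserNamespacesSupport:true]}`
	fmt.Println(parseGates(line)) // map[KMSv1:true NodeSwap:false UserNamespacesSupport:true]
}
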
Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450847 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450851 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450856 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450862 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450866 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450871 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450877 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:39:08.452945 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450881 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450885 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450889 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450894 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450898 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450903 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450907 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450913 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.450918 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.450926 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451086 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451094 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451099 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:39:08.453409 
ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451105 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451110 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451114 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:39:08.453409 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451119 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451123 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451127 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451132 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451136 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451140 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451144 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451148 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451153 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451157 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451161 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451165 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451169 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451173 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451177 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451181 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451186 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451190 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451194 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:39:08.453828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451198 2578 feature_gate.go:328] 
unrecognized feature gate: RouteAdvertisements Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451202 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451206 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451210 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451215 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451219 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451224 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451232 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451236 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451240 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451244 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451248 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451252 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451256 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451261 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451265 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451269 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451273 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451277 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451281 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:39:08.454302 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451286 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451290 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451295 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: 
W0416 08:39:08.451301 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451306 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451311 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451315 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451320 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451324 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451328 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451334 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451339 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451343 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451348 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451352 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451357 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451361 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451366 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451371 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451375 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:39:08.454806 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451379 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451383 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451388 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451392 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451397 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451401 2578 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesvSphere Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451405 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451409 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451413 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451418 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451422 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451426 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451430 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451434 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451439 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451442 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451446 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451450 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451454 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:39:08.455305 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451458 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:39:08.456036 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:08.451462 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:39:08.456036 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.451469 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:39:08.456036 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.451639 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 08:39:08.457816 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.457799 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 08:39:08.458919 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.458905 2578 server.go:1019] "Starting client certificate rotation" Apr 16 08:39:08.459028 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:08.459009 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 08:39:08.459067 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.459054 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 08:39:08.487168 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.487141 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 08:39:08.489956 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.489935 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 08:39:08.509576 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.509553 2578 log.go:25] "Validated CRI v1 runtime API" Apr 16 08:39:08.514997 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.514976 2578 log.go:25] "Validated CRI v1 image API" Apr 16 08:39:08.516278 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.516264 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 08:39:08.521351 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.521330 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 08:39:08.524620 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.524592 2578 fs.go:135] Filesystem UUIDs: map[1375ade0-018a-4114-8d27-d80802a471d9:/dev/nvme0n1p3 200dd984-4860-4fac-adb7-d6d7622e0ddd:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 16 08:39:08.524702 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.524621 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 08:39:08.531300 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.531161 2578 manager.go:217] Machine: {Timestamp:2026-04-16 08:39:08.528893213 +0000 UTC m=+0.448879079 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099911 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec220c55b402685dbfb807eadf071eb8 SystemUUID:ec220c55-b402-685d-bfb8-07eadf071eb8 BootID:548148ad-bf5d-4889-8f3f-94886a6ec4e5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 
Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f1:7c:8c:a8:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f1:7c:8c:a8:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:fd:08:b8:38:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 08:39:08.531300 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.531290 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 08:39:08.531425 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.531370 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 08:39:08.532610 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.532583 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 08:39:08.532758 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.532613 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-139-8.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 08:39:08.532806 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.532765 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 08:39:08.532806 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.532773 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 08:39:08.532806 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.532785 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:39:08.533680 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.533670 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:39:08.534953 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.534944 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:39:08.535056 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.535047 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 08:39:08.535582 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.535565 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fn284" Apr 16 08:39:08.537964 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.537954 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 08:39:08.538007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.537968 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 08:39:08.538007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.537984 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 08:39:08.538007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.537994 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 08:39:08.538007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.538003 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 08:39:08.539244 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.539229 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:39:08.539244 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.539247 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:39:08.542299 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.542277 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 08:39:08.542393 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.542379 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fn284" Apr 16 08:39:08.543996 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.543982 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 08:39:08.545395 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545382 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 08:39:08.545435 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545404 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 08:39:08.545435 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545413 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 08:39:08.545435 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545423 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 08:39:08.545435 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545432 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545442 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545451 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545460 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545470 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545479 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545491 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 08:39:08.545543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.545503 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 08:39:08.546429 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.546415 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 08:39:08.546429 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.546430 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 08:39:08.549489 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.549470 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:08.549895 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:08.549883 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 08:39:08.549941 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.549917 2578 server.go:1295] "Started kubelet" Apr 16 08:39:08.550091 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.550060 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 08:39:08.550091 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.550056 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 08:39:08.550199 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.550115 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 08:39:08.550828 ip-10-0-139-8 systemd[1]: Started Kubernetes Kubelet. Apr 16 08:39:08.550922 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.550829 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:08.551212 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.551194 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 08:39:08.554120 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.554095 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-8.ec2.internal" not found Apr 16 08:39:08.554905 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.554889 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 08:39:08.559137 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.559119 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 08:39:08.559764 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.559747 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 08:39:08.560556 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.560542 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 08:39:08.560556 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.560556 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 08:39:08.560688 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.560597 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 08:39:08.560688 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.560667 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 08:39:08.560688 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.560673 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 08:39:08.560981 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.560877 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-8.ec2.internal\" not found" Apr 16 08:39:08.561048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561012 2578 factory.go:153] Registering CRI-O factory Apr 16 08:39:08.561048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561029 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 08:39:08.561125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561102 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 08:39:08.561125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561113 2578 factory.go:55] Registering systemd factory Apr 16 08:39:08.561125 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:08.561121 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 08:39:08.561206 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561144 2578 factory.go:103] Registering Raw factory Apr 16 08:39:08.561206 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561162 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 08:39:08.561848 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.561836 2578 manager.go:319] Starting recovery of all containers Apr 16 08:39:08.562130 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.562108 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:08.564832 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.564613 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-8.ec2.internal\" not found" node="ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.565122 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.565099 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 08:39:08.568553 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.568538 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-8.ec2.internal" not found Apr 16 08:39:08.569107 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.569065 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 08:39:08.571769 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.571752 2578 manager.go:324] Recovery completed Apr 16 08:39:08.576453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.576440 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:39:08.578702 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.578687 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:39:08.578790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.578733 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:39:08.578790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.578744 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:39:08.579211 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.579198 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 08:39:08.579260 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.579213 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 08:39:08.579260 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.579232 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:39:08.581649 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.581637 2578 policy_none.go:49] "None policy: Start" Apr 16 08:39:08.581692 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.581655 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 08:39:08.581692 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.581665 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 08:39:08.619356 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619340 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.619383 2578 
manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619396 2578 server.go:85] "Starting device plugin registration server" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619651 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619664 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619782 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619847 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.619855 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.620343 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.620510 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-8.ec2.internal\" not found" Apr 16 08:39:08.639815 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.623222 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-8.ec2.internal" not found Apr 16 08:39:08.661260 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.661233 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 08:39:08.661349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.661268 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 08:39:08.661349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.661287 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
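
The Machine entry earlier reports MemoryCapacity:32812171264, and the nodeConfig entry sets SystemReserved memory to 1Gi with a 100Mi memory.available hard-eviction threshold and no KubeReserved. Assuming the standard node-allocatable formula (capacity minus kube-reserved minus system-reserved minus hard eviction), the arithmetic works out as in the short Go sketch below; the allocatable value the kubelet actually reports may be rounded or differ slightly.

package main

import "fmt"

// Rough node-allocatable arithmetic for memory:
//   allocatable = capacity - kube-reserved - system-reserved - hard-eviction.
// Values are taken from the Machine and nodeConfig entries in this log.
func main() {
	const (
		Mi = int64(1) << 20
		Gi = int64(1) << 30

		capacity       = int64(32812171264) // MemoryCapacity from the Machine entry
		kubeReserved   = int64(0)           // KubeReserved is null in nodeConfig
		systemReserved = 1 * Gi             // SystemReserved memory: "1Gi"
		evictionHard   = 100 * Mi           // HardEvictionThresholds memory.available: "100Mi"
	)

	allocatable := capacity - kubeReserved - systemReserved - evictionHard
	fmt.Printf("allocatable memory ~ %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(Gi))
}
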
Apr 16 08:39:08.661349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.661293 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 08:39:08.661349 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:08.661324 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 08:39:08.665205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.665185 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:08.720729 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.720638 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:39:08.721621 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.721604 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:39:08.721707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.721633 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:39:08.721707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.721644 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:39:08.721707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.721668 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.730637 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.730618 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.761833 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.761805 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal"] Apr 16 08:39:08.765064 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.765048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.765142 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.765052 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.788344 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.788326 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.792800 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.792786 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.801517 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.801502 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 08:39:08.801592 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.801502 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 08:39:08.961124 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.961095 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.961124 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.961122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:08.961320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:08.961145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff050f160b53222212c83886cf4ca7d7-config\") pod \"kube-apiserver-proxy-ip-10-0-139-8.ec2.internal\" (UID: \"ff050f160b53222212c83886cf4ca7d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062314 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff050f160b53222212c83886cf4ca7d7-config\") pod \"kube-apiserver-proxy-ip-10-0-139-8.ec2.internal\" (UID: \"ff050f160b53222212c83886cf4ca7d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062314 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff050f160b53222212c83886cf4ca7d7-config\") pod \"kube-apiserver-proxy-ip-10-0-139-8.ec2.internal\" (UID: \"ff050f160b53222212c83886cf4ca7d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.062314 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.062196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80da84d77c931c1bc8843e7cd21359fc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal\" (UID: \"80da84d77c931c1bc8843e7cd21359fc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.105207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.105175 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.105316 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.105231 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" Apr 16 08:39:09.459190 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.459104 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 08:39:09.459701 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.459236 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 08:39:09.459701 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.459267 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 08:39:09.459701 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.459284 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 08:39:09.538144 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.538113 2578 apiserver.go:52] "Watching apiserver" Apr 16 08:39:09.544573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.544528 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 08:34:08 +0000 UTC" deadline="2028-01-13 23:09:56.723329632 +0000 UTC" Apr 16 08:39:09.544573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.544563 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15302h30m47.178768815s" Apr 16 08:39:09.545960 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.545943 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 08:39:09.546281 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.546262 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kvw77","openshift-network-operator/iptables-alerter-7m67g","openshift-dns/node-resolver-6ngxf","openshift-image-registry/node-ca-xxvcl","openshift-multus/multus-additional-cni-plugins-nsv7m","openshift-network-diagnostics/network-check-target-x9sn2","openshift-ovn-kubernetes/ovnkube-node-cvzdh","kube-system/konnectivity-agent-g5rw9","kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc","openshift-cluster-node-tuning-operator/tuned-wrx69","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal","openshift-multus/multus-hh7sm"] Apr 16 08:39:09.547837 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.547820 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.547917 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.547900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:09.549012 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.548974 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.550198 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.550180 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.551329 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.551308 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 08:39:09.551452 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.551434 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.551524 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.551454 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zlvc6\"" Apr 16 08:39:09.551524 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.551468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.552639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.552623 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.552639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.552635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.552990 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.552972 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.553468 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.553401 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8gc9p\"" Apr 16 08:39:09.554289 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.554266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:09.554394 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.554365 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:09.556874 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.555333 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-szt97\"" Apr 16 08:39:09.556874 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.555317 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 08:39:09.556874 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.555377 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.556874 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.555428 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.556874 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.555664 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.557550 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.557529 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.558149 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558133 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 08:39:09.558285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558264 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 08:39:09.558426 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558403 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.558426 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-96fgd\"" Apr 16 08:39:09.558786 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558766 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 08:39:09.558882 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558799 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.558882 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.558803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.559206 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.559190 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 08:39:09.559667 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.559652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 08:39:09.559931 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.559919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 08:39:09.560188 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.560173 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.560301 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.560288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 08:39:09.560343 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.560321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.561494 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561475 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zk9zs\"" Apr 16 08:39:09.561570 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561480 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 08:39:09.561570 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.561800 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561786 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 08:39:09.561800 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561793 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 08:39:09.561958 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.561844 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fgwx2\"" Apr 16 08:39:09.562105 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.562088 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.562393 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.562380 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.562748 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.562734 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 08:39:09.562822 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.562737 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.562822 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.562764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gjpcq\"" Apr 16 08:39:09.563522 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.563499 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.564320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.564302 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:39:09.564660 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.564646 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t5cfx\"" Apr 16 08:39:09.564729 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.564668 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 08:39:09.565067 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/654861bc-7246-41e6-a23f-20623cc156ef-konnectivity-ca\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.565140 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-device-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.565140 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkd6x\" (UniqueName: \"kubernetes.io/projected/8ad5142c-d6f7-4dec-94b9-064b7167a387-kube-api-access-fkd6x\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.565140 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6gw\" (UniqueName: \"kubernetes.io/projected/c9e716af-74d3-4a46-891e-32d46250da3e-kube-api-access-mt6gw\") pod \"iptables-alerter-7m67g\" (UID: 
\"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.565140 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7sqg\" (UniqueName: \"kubernetes.io/projected/1ba56a43-ff49-4b6d-a602-289479e4e2f7-kube-api-access-n7sqg\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-var-lib-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-script-lib\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-modprobe-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovn-node-metrics-cert\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-var-lib-kubelet\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ba56a43-ff49-4b6d-a602-289479e4e2f7-tmp-dir\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.565386 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:39:09.565328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-systemd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbd7\" (UniqueName: \"kubernetes.io/projected/13e8353c-4eb0-4abd-98df-42ece4ec0318-kube-api-access-cvbd7\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.565386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fcs\" (UniqueName: \"kubernetes.io/projected/9befbb38-427c-4f04-9ac5-007147cbf0ea-kube-api-access-45fcs\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-slash\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-lib-modules\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-systemd-units\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565538 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpv6\" (UniqueName: \"kubernetes.io/projected/e2881a10-5691-4f22-92fd-70bdbdbacec2-kube-api-access-dzpv6\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-host\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-tuned\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqh55\" (UniqueName: \"kubernetes.io/projected/46d8d537-5ba7-484f-944f-56bdb9ac055f-kube-api-access-jqh55\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-registration-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-conf\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9befbb38-427c-4f04-9ac5-007147cbf0ea-host\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565739 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9befbb38-427c-4f04-9ac5-007147cbf0ea-serviceca\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ba56a43-ff49-4b6d-a602-289479e4e2f7-hosts-file\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.565875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-socket-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-netd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-env-overrides\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.565993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-tmp\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/c9e716af-74d3-4a46-891e-32d46250da3e-host-slash\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysconfig\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-system-cni-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bd8b\" (UniqueName: \"kubernetes.io/projected/2583f7f3-820e-46ff-b710-c2256f41f5c1-kube-api-access-7bd8b\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rpk6n\"" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-log-socket\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-etc-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-ovn\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566235 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-config\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/654861bc-7246-41e6-a23f-20623cc156ef-agent-certs\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.566465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-sys-fs\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-run\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-etc-selinux\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-systemd\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-sys\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-os-release\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-bin\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566495 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-kubernetes\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-node-log\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566563 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c9e716af-74d3-4a46-891e-32d46250da3e-iptables-alerter-script\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-cnibin\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " 
pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-kubelet\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.566981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.566675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-netns\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.568892 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.568871 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 08:39:09.587759 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.587740 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-k6pxl" Apr 16 08:39:09.596947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.596928 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-k6pxl" Apr 16 08:39:09.632828 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.632803 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff050f160b53222212c83886cf4ca7d7.slice/crio-b805c4c261027b400f2009d374d6258149bc0c266e45f3245d31552e57a780d9 WatchSource:0}: Error finding container b805c4c261027b400f2009d374d6258149bc0c266e45f3245d31552e57a780d9: Status 404 returned error can't find the container with id b805c4c261027b400f2009d374d6258149bc0c266e45f3245d31552e57a780d9 Apr 16 08:39:09.633052 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.633033 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80da84d77c931c1bc8843e7cd21359fc.slice/crio-117653ef45316ba1e1c20d25635b4863e44af47b2ef263589e9cc00fc8fb6b8a WatchSource:0}: Error finding container 117653ef45316ba1e1c20d25635b4863e44af47b2ef263589e9cc00fc8fb6b8a: Status 404 returned error can't find the container with id 117653ef45316ba1e1c20d25635b4863e44af47b2ef263589e9cc00fc8fb6b8a Apr 16 08:39:09.637380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.637360 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:39:09.662138 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.662122 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 08:39:09.664328 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.664287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" event={"ID":"80da84d77c931c1bc8843e7cd21359fc","Type":"ContainerStarted","Data":"117653ef45316ba1e1c20d25635b4863e44af47b2ef263589e9cc00fc8fb6b8a"} Apr 16 08:39:09.665186 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.665170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" event={"ID":"ff050f160b53222212c83886cf4ca7d7","Type":"ContainerStarted","Data":"b805c4c261027b400f2009d374d6258149bc0c266e45f3245d31552e57a780d9"} Apr 16 08:39:09.667420 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-slash\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.667471 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-lib-modules\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667471 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.667471 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-systemd-units\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.667586 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpv6\" (UniqueName: \"kubernetes.io/projected/e2881a10-5691-4f22-92fd-70bdbdbacec2-kube-api-access-dzpv6\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.667586 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-systemd-units\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.667586 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-host\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667586 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-tuned\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667586 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-slash\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-host\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667576 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-lib-modules\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqh55\" (UniqueName: \"kubernetes.io/projected/46d8d537-5ba7-484f-944f-56bdb9ac055f-kube-api-access-jqh55\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-kubelet\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-hostroot\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-registration-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-conf\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667759 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9befbb38-427c-4f04-9ac5-007147cbf0ea-host\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-registration-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-conf\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9befbb38-427c-4f04-9ac5-007147cbf0ea-serviceca\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.667835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-conf-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.667857 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ba56a43-ff49-4b6d-a602-289479e4e2f7-hosts-file\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9befbb38-427c-4f04-9ac5-007147cbf0ea-host\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-socket-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.667919 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" 
failed. No retries permitted until 2026-04-16 08:39:10.167887684 +0000 UTC m=+2.087873537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667928 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ba56a43-ff49-4b6d-a602-289479e4e2f7-hosts-file\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-k8s-cni-cncf-io\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.667988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-socket-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668011 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-netd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-env-overrides\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-tmp\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-netd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9befbb38-427c-4f04-9ac5-007147cbf0ea-serviceca\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.668360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668124 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-cni-binary-copy\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-daemon-config\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ls9d\" (UniqueName: \"kubernetes.io/projected/c6fc606d-9332-4d96-911e-24bed66bbda7-kube-api-access-5ls9d\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9e716af-74d3-4a46-891e-32d46250da3e-host-slash\") 
pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysconfig\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9e716af-74d3-4a46-891e-32d46250da3e-host-slash\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysconfig\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-os-release\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-socket-dir-parent\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-etc-kubernetes\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-system-cni-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bd8b\" (UniqueName: \"kubernetes.io/projected/2583f7f3-820e-46ff-b710-c2256f41f5c1-kube-api-access-7bd8b\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-log-socket\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-system-cni-dir\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.669158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-system-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-etc-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-log-socket\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-ovn\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-etc-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668535 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-ovn\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-config\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668577 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/654861bc-7246-41e6-a23f-20623cc156ef-agent-certs\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-sys-fs\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-run\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-run\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-sys-fs\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-etc-selinux\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-systemd\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668803 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-sys\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-os-release\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.669933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-bin\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-kubernetes\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-etc-selinux\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.668980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-node-log\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" 
Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c9e716af-74d3-4a46-891e-32d46250da3e-iptables-alerter-script\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-cnibin\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-kubelet\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-env-overrides\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-netns\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/654861bc-7246-41e6-a23f-20623cc156ef-konnectivity-ca\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-node-log\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-device-dir\") pod 
\"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-config\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.670606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669360 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-kubernetes\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-cni-bin\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ad5142c-d6f7-4dec-94b9-064b7167a387-device-dir\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkd6x\" (UniqueName: \"kubernetes.io/projected/8ad5142c-d6f7-4dec-94b9-064b7167a387-kube-api-access-fkd6x\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-sysctl-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-netns\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-multus\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-multus-certs\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-systemd\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6gw\" (UniqueName: \"kubernetes.io/projected/c9e716af-74d3-4a46-891e-32d46250da3e-kube-api-access-mt6gw\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7sqg\" (UniqueName: \"kubernetes.io/projected/1ba56a43-ff49-4b6d-a602-289479e4e2f7-kube-api-access-n7sqg\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-sys\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-var-lib-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-script-lib\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: 
I0416 08:39:09.669675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-os-release\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.671320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-modprobe-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c9e716af-74d3-4a46-891e-32d46250da3e-iptables-alerter-script\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-cnibin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-kubelet\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-modprobe-d\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovn-node-metrics-cert\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-var-lib-kubelet\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.669977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-bin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ba56a43-ff49-4b6d-a602-289479e4e2f7-tmp-dir\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-systemd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbd7\" (UniqueName: \"kubernetes.io/projected/13e8353c-4eb0-4abd-98df-42ece4ec0318-kube-api-access-cvbd7\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45fcs\" (UniqueName: \"kubernetes.io/projected/9befbb38-427c-4f04-9ac5-007147cbf0ea-kube-api-access-45fcs\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46d8d537-5ba7-484f-944f-56bdb9ac055f-var-lib-kubelet\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.671940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-netns\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovnkube-script-lib\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/654861bc-7246-41e6-a23f-20623cc156ef-konnectivity-ca\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2583f7f3-820e-46ff-b710-c2256f41f5c1-cnibin\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-var-lib-openvswitch\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.670970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ba56a43-ff49-4b6d-a602-289479e4e2f7-tmp-dir\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-run-systemd\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2881a10-5691-4f22-92fd-70bdbdbacec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-tmp\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46d8d537-5ba7-484f-944f-56bdb9ac055f-etc-tuned\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/654861bc-7246-41e6-a23f-20623cc156ef-agent-certs\") pod \"konnectivity-agent-g5rw9\" (UID: \"654861bc-7246-41e6-a23f-20623cc156ef\") " pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.671675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2583f7f3-820e-46ff-b710-c2256f41f5c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.672512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.672255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2881a10-5691-4f22-92fd-70bdbdbacec2-ovn-node-metrics-cert\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.676475 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.676456 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:09.676475 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.676474 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:09.676475 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.676482 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:09.676858 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:09.676532 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:39:10.176519971 +0000 UTC m=+2.096505827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:09.678642 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.678617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqh55\" (UniqueName: \"kubernetes.io/projected/46d8d537-5ba7-484f-944f-56bdb9ac055f-kube-api-access-jqh55\") pod \"tuned-wrx69\" (UID: \"46d8d537-5ba7-484f-944f-56bdb9ac055f\") " pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.679082 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.679060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpv6\" (UniqueName: \"kubernetes.io/projected/e2881a10-5691-4f22-92fd-70bdbdbacec2-kube-api-access-dzpv6\") pod \"ovnkube-node-cvzdh\" (UID: \"e2881a10-5691-4f22-92fd-70bdbdbacec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.679260 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.679243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fcs\" (UniqueName: \"kubernetes.io/projected/9befbb38-427c-4f04-9ac5-007147cbf0ea-kube-api-access-45fcs\") pod \"node-ca-xxvcl\" (UID: \"9befbb38-427c-4f04-9ac5-007147cbf0ea\") " pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.679496 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.679475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bd8b\" (UniqueName: \"kubernetes.io/projected/2583f7f3-820e-46ff-b710-c2256f41f5c1-kube-api-access-7bd8b\") pod \"multus-additional-cni-plugins-nsv7m\" (UID: \"2583f7f3-820e-46ff-b710-c2256f41f5c1\") " pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.679814 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.679798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7sqg\" (UniqueName: \"kubernetes.io/projected/1ba56a43-ff49-4b6d-a602-289479e4e2f7-kube-api-access-n7sqg\") pod \"node-resolver-6ngxf\" (UID: \"1ba56a43-ff49-4b6d-a602-289479e4e2f7\") " pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.679939 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.679923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbd7\" (UniqueName: \"kubernetes.io/projected/13e8353c-4eb0-4abd-98df-42ece4ec0318-kube-api-access-cvbd7\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:09.680100 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.680081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkd6x\" (UniqueName: \"kubernetes.io/projected/8ad5142c-d6f7-4dec-94b9-064b7167a387-kube-api-access-fkd6x\") pod \"aws-ebs-csi-driver-node-nk2sc\" (UID: \"8ad5142c-d6f7-4dec-94b9-064b7167a387\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.680568 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.680548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6gw\" (UniqueName: 
\"kubernetes.io/projected/c9e716af-74d3-4a46-891e-32d46250da3e-kube-api-access-mt6gw\") pod \"iptables-alerter-7m67g\" (UID: \"c9e716af-74d3-4a46-891e-32d46250da3e\") " pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.771376 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771290 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-netns\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771376 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-multus\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771376 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-multus-certs\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-cnibin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-multus\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-bin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-netns\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-kubelet\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-cni-bin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-cnibin\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-multus-certs\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-hostroot\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-hostroot\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-var-lib-kubelet\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.771564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-conf-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-conf-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " 
pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-k8s-cni-cncf-io\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-cni-binary-copy\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-host-run-k8s-cni-cncf-io\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-daemon-config\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ls9d\" (UniqueName: \"kubernetes.io/projected/c6fc606d-9332-4d96-911e-24bed66bbda7-kube-api-access-5ls9d\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-os-release\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-socket-dir-parent\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-etc-kubernetes\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-system-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 
08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-os-release\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-etc-kubernetes\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-socket-dir-parent\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.771845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6fc606d-9332-4d96-911e-24bed66bbda7-system-cni-dir\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772412 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.772104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-cni-binary-copy\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.772412 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.772168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6fc606d-9332-4d96-911e-24bed66bbda7-multus-daemon-config\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.779369 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.779352 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ls9d\" (UniqueName: \"kubernetes.io/projected/c6fc606d-9332-4d96-911e-24bed66bbda7-kube-api-access-5ls9d\") pod \"multus-hh7sm\" (UID: \"c6fc606d-9332-4d96-911e-24bed66bbda7\") " pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.879533 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.879495 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7m67g" Apr 16 08:39:09.885160 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.885144 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6ngxf" Apr 16 08:39:09.885494 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.885474 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e716af_74d3_4a46_891e_32d46250da3e.slice/crio-99c97deacdc9992aecfa87b293b0269288535dc7f079f9c3b82f40f444384b2b WatchSource:0}: Error finding container 99c97deacdc9992aecfa87b293b0269288535dc7f079f9c3b82f40f444384b2b: Status 404 returned error can't find the container with id 99c97deacdc9992aecfa87b293b0269288535dc7f079f9c3b82f40f444384b2b Apr 16 08:39:09.890987 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.890967 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba56a43_ff49_4b6d_a602_289479e4e2f7.slice/crio-5e435a61e650d7ac0b06603955b29dde3a8ccd41544e9c363d917209b91e1d40 WatchSource:0}: Error finding container 5e435a61e650d7ac0b06603955b29dde3a8ccd41544e9c363d917209b91e1d40: Status 404 returned error can't find the container with id 5e435a61e650d7ac0b06603955b29dde3a8ccd41544e9c363d917209b91e1d40 Apr 16 08:39:09.907221 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.907202 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xxvcl" Apr 16 08:39:09.911916 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.911451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" Apr 16 08:39:09.916626 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.916604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9befbb38_427c_4f04_9ac5_007147cbf0ea.slice/crio-d0dc46eafebdb4200bfd212e200d683b3cb095601f0d8ecc6920b74f0a976a18 WatchSource:0}: Error finding container d0dc46eafebdb4200bfd212e200d683b3cb095601f0d8ecc6920b74f0a976a18: Status 404 returned error can't find the container with id d0dc46eafebdb4200bfd212e200d683b3cb095601f0d8ecc6920b74f0a976a18 Apr 16 08:39:09.919163 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.919144 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:09.919278 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.919212 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2583f7f3_820e_46ff_b710_c2256f41f5c1.slice/crio-135f2f6a46e313d4dea9ba72d574067927fb9180a40fe646fdc8f8db490f1ec8 WatchSource:0}: Error finding container 135f2f6a46e313d4dea9ba72d574067927fb9180a40fe646fdc8f8db490f1ec8: Status 404 returned error can't find the container with id 135f2f6a46e313d4dea9ba72d574067927fb9180a40fe646fdc8f8db490f1ec8 Apr 16 08:39:09.925574 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.925556 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2881a10_5691_4f22_92fd_70bdbdbacec2.slice/crio-4cb1d162302ef94495b90caadfb3f1a1d59fbe6ead6bd0d7224c4e715f6ecf12 WatchSource:0}: Error finding container 4cb1d162302ef94495b90caadfb3f1a1d59fbe6ead6bd0d7224c4e715f6ecf12: Status 404 returned error can't find the container with id 4cb1d162302ef94495b90caadfb3f1a1d59fbe6ead6bd0d7224c4e715f6ecf12 Apr 16 08:39:09.943386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.943363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:09.948672 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.948652 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod654861bc_7246_41e6_a23f_20623cc156ef.slice/crio-83712ad9ed9e745ff84f4183b635cf1ddd217faa5855fd434d243f966bee813d WatchSource:0}: Error finding container 83712ad9ed9e745ff84f4183b635cf1ddd217faa5855fd434d243f966bee813d: Status 404 returned error can't find the container with id 83712ad9ed9e745ff84f4183b635cf1ddd217faa5855fd434d243f966bee813d Apr 16 08:39:09.949940 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.949927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" Apr 16 08:39:09.955347 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.955330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wrx69" Apr 16 08:39:09.955817 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.955779 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad5142c_d6f7_4dec_94b9_064b7167a387.slice/crio-057d2b08b444469bc7943b346b22b48265af29fb1e8447a0f69d46373a5ccfae WatchSource:0}: Error finding container 057d2b08b444469bc7943b346b22b48265af29fb1e8447a0f69d46373a5ccfae: Status 404 returned error can't find the container with id 057d2b08b444469bc7943b346b22b48265af29fb1e8447a0f69d46373a5ccfae Apr 16 08:39:09.959295 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:09.959276 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hh7sm" Apr 16 08:39:09.961017 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.960996 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d8d537_5ba7_484f_944f_56bdb9ac055f.slice/crio-78f6df3a541f2ccbc52dc6cfd51feb3473bb9ab31c759a12127489c6efddf6bc WatchSource:0}: Error finding container 78f6df3a541f2ccbc52dc6cfd51feb3473bb9ab31c759a12127489c6efddf6bc: Status 404 returned error can't find the container with id 78f6df3a541f2ccbc52dc6cfd51feb3473bb9ab31c759a12127489c6efddf6bc Apr 16 08:39:09.966269 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:09.966250 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6fc606d_9332_4d96_911e_24bed66bbda7.slice/crio-6c0be291d66e797d5438c11cc19afac6a0702bf2f7562a00a1d27f5c0a25051b WatchSource:0}: Error finding container 6c0be291d66e797d5438c11cc19afac6a0702bf2f7562a00a1d27f5c0a25051b: Status 404 returned error can't find the container with id 6c0be291d66e797d5438c11cc19afac6a0702bf2f7562a00a1d27f5c0a25051b Apr 16 08:39:10.174696 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.174612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:10.174858 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.174807 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:10.174912 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.174864 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:11.174846372 +0000 UTC m=+3.094832240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:10.275941 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.275906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:10.276112 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.276073 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:10.276112 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.276094 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:10.276112 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.276106 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:10.276274 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.276163 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:39:11.276143278 +0000 UTC m=+3.196129136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:10.589014 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.588935 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:10.598210 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.598176 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:34:09 +0000 UTC" deadline="2027-11-05 04:16:20.868720532 +0000 UTC" Apr 16 08:39:10.598210 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.598208 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13627h37m10.27051544s" Apr 16 08:39:10.663893 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.663861 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:10.664105 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:10.663992 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:10.709112 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.709071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wrx69" event={"ID":"46d8d537-5ba7-484f-944f-56bdb9ac055f","Type":"ContainerStarted","Data":"78f6df3a541f2ccbc52dc6cfd51feb3473bb9ab31c759a12127489c6efddf6bc"} Apr 16 08:39:10.715440 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.715408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerStarted","Data":"135f2f6a46e313d4dea9ba72d574067927fb9180a40fe646fdc8f8db490f1ec8"} Apr 16 08:39:10.733953 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.733915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6ngxf" event={"ID":"1ba56a43-ff49-4b6d-a602-289479e4e2f7","Type":"ContainerStarted","Data":"5e435a61e650d7ac0b06603955b29dde3a8ccd41544e9c363d917209b91e1d40"} Apr 16 08:39:10.742670 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.742595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7m67g" event={"ID":"c9e716af-74d3-4a46-891e-32d46250da3e","Type":"ContainerStarted","Data":"99c97deacdc9992aecfa87b293b0269288535dc7f079f9c3b82f40f444384b2b"} Apr 16 08:39:10.759820 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.759743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hh7sm" event={"ID":"c6fc606d-9332-4d96-911e-24bed66bbda7","Type":"ContainerStarted","Data":"6c0be291d66e797d5438c11cc19afac6a0702bf2f7562a00a1d27f5c0a25051b"} Apr 16 08:39:10.769217 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.769184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" event={"ID":"8ad5142c-d6f7-4dec-94b9-064b7167a387","Type":"ContainerStarted","Data":"057d2b08b444469bc7943b346b22b48265af29fb1e8447a0f69d46373a5ccfae"} Apr 16 08:39:10.771153 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.771125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g5rw9" event={"ID":"654861bc-7246-41e6-a23f-20623cc156ef","Type":"ContainerStarted","Data":"83712ad9ed9e745ff84f4183b635cf1ddd217faa5855fd434d243f966bee813d"} Apr 16 08:39:10.780442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.780404 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:10.783730 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.783684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"4cb1d162302ef94495b90caadfb3f1a1d59fbe6ead6bd0d7224c4e715f6ecf12"} Apr 16 08:39:10.789061 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:10.788635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xxvcl" event={"ID":"9befbb38-427c-4f04-9ac5-007147cbf0ea","Type":"ContainerStarted","Data":"d0dc46eafebdb4200bfd212e200d683b3cb095601f0d8ecc6920b74f0a976a18"} Apr 16 08:39:10.965226 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:10.965140 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:39:11.184384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:11.184348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:11.184576 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.184525 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:11.184645 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.184588 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:13.184570207 +0000 UTC m=+5.104556065 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:11.285590 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:11.285360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:11.285590 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.285510 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:11.285590 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.285532 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:11.285590 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.285544 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:11.285993 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.285599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:39:13.285581562 +0000 UTC m=+5.205567433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:11.598457 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:11.598366 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:34:09 +0000 UTC" deadline="2027-12-08 18:11:55.958313501 +0000 UTC" Apr 16 08:39:11.598457 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:11.598404 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14433h32m44.359913578s" Apr 16 08:39:11.662316 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:11.662281 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:11.662495 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:11.662431 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:12.398112 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.397961 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7mf7d"] Apr 16 08:39:12.401051 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.400930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.401051 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:12.401007 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:12.494434 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.494378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.494605 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.494455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-kubelet-config\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.494605 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.494487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-dbus\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.595503 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.595463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.595700 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.595518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-kubelet-config\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.595700 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.595550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-dbus\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.595838 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.595747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-dbus\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.595890 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:12.595853 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:12.595942 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:12.595908 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:39:13.095888401 +0000 UTC m=+5.015874259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:12.596222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.596202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-kubelet-config\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:12.664344 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:12.664238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:12.664844 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:12.664418 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:13.101149 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:13.100861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:13.101149 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.101035 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:13.101149 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.101101 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:14.101080956 +0000 UTC m=+6.021066828 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:13.202019 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:13.201981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:13.202170 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.202144 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:13.202225 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.202211 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:17.202189308 +0000 UTC m=+9.122175355 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:13.303357 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:13.303319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:13.303537 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.303482 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:13.303537 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.303500 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:13.303537 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.303512 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:13.303686 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.303567 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:39:17.303549875 +0000 UTC m=+9.223535729 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:13.661939 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:13.661902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:13.662137 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.662037 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:13.662457 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:13.662437 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:13.662564 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:13.662545 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:14.110011 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:14.109926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:14.110446 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:14.110118 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:14.110446 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:14.110179 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:16.110161704 +0000 UTC m=+8.030147561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:14.663976 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:14.663943 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:14.664148 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:14.664074 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:15.662640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:15.661977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:15.662640 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:15.662120 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:15.662640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:15.662461 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:15.662640 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:15.662596 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:16.128051 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:16.127965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:16.128217 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:16.128092 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:16.128217 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:16.128165 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:20.128142585 +0000 UTC m=+12.048128440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:16.663403 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:16.662931 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:16.663403 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:16.663057 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:17.236979 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:17.236936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:17.237153 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.237114 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:17.237220 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.237176 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:25.237158243 +0000 UTC m=+17.157144104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:17.337374 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:17.337325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:17.337557 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.337500 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:17.337557 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.337520 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:17.337557 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.337531 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:17.337706 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.337586 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. 
No retries permitted until 2026-04-16 08:39:25.337567605 +0000 UTC m=+17.257553474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:17.662111 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:17.662032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:17.662267 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:17.662044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:17.662267 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.662171 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:17.662267 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:17.662237 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:18.662952 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:18.662810 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:18.662952 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:18.662925 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:19.662430 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:19.662395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:19.662623 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:19.662394 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:19.662623 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:19.662500 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:19.662623 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:19.662604 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:20.160083 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:20.160043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:20.160454 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:20.160158 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:20.160454 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:20.160212 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:28.160197712 +0000 UTC m=+20.080183585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:20.661893 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:20.661859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:20.662067 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:20.661989 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:21.662485 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:21.662448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:21.662485 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:21.662449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:21.662978 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:21.662603 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:21.662978 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:21.662749 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:22.661933 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:22.661908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:22.662107 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:22.662002 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:23.661875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:23.661792 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:23.661875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:23.661800 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:23.662338 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:23.661910 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:23.662338 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:23.662041 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:24.661997 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:24.661817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:24.662401 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:24.662094 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:25.295655 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:25.295617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:25.295885 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.295805 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:25.295885 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.295859 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:41.295843637 +0000 UTC m=+33.215829490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:25.396849 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:25.396817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:25.397021 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.397001 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:25.397090 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.397023 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:25.397090 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.397033 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:25.397090 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.397088 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:39:41.397071206 +0000 UTC m=+33.317057082 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:25.661790 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:25.661750 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:25.661960 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:25.661754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:25.661960 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.661882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:25.662076 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:25.661948 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:26.661885 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:26.661851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:26.662141 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:26.661971 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:27.661884 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:27.661849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:27.662062 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:27.661856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:27.662062 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:27.661955 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:27.662062 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:27.662044 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:28.217587 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.217562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:28.217706 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:28.217690 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:28.217789 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:28.217782 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret podName:dc53c5fe-6772-48e1-b1d9-82b3bf47aca3 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:44.217761013 +0000 UTC m=+36.137746866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret") pod "global-pull-secret-syncer-7mf7d" (UID: "dc53c5fe-6772-48e1-b1d9-82b3bf47aca3") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:39:28.662830 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.662797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:28.663259 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:28.662916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:28.828095 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.827665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" event={"ID":"ff050f160b53222212c83886cf4ca7d7","Type":"ContainerStarted","Data":"a032fea37f5faee15d1353051b2a303a7ee8fc8b0dcdbc95988c1d7640a10fc2"} Apr 16 08:39:28.830819 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.830789 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hh7sm" event={"ID":"c6fc606d-9332-4d96-911e-24bed66bbda7","Type":"ContainerStarted","Data":"4ce25623d9ba4720f33528e70d2bbdb31079399c0a136496bb3f02db3e88918b"} Apr 16 08:39:28.837012 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.836995 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:39:28.837378 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.837355 2578 generic.go:358] "Generic (PLEG): container finished" podID="e2881a10-5691-4f22-92fd-70bdbdbacec2" containerID="5609ab8a06e32a00b4bee3235e7a43094c1335adf1066689d287fb1c52307154" exitCode=1 Apr 16 08:39:28.837455 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.837433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"48581a8fe33600d94d749958838dce07e0250a3763f22c1f52ef837d428c83fc"} Apr 16 08:39:28.837492 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.837457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"027eaa7a98aa2baf3f094d03fb69b38db909379b08e8e85b1735497809b76d6b"} Apr 16 08:39:28.837492 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.837471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerDied","Data":"5609ab8a06e32a00b4bee3235e7a43094c1335adf1066689d287fb1c52307154"} Apr 16 08:39:28.837492 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.837485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"73e8b91a5300e9a02a8b15d13d59489946a673b33cd131c20ffe3e3f92e108b9"} Apr 16 08:39:28.839328 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.838984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wrx69" event={"ID":"46d8d537-5ba7-484f-944f-56bdb9ac055f","Type":"ContainerStarted","Data":"bec3824504366f33fd05213d616f78472688515539bbd9c3bbfbde72adb1b978"} Apr 16 08:39:28.855962 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.855890 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-8.ec2.internal" podStartSLOduration=20.855871177 podStartE2EDuration="20.855871177s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:39:28.840799597 +0000 UTC m=+20.760785473" watchObservedRunningTime="2026-04-16 
08:39:28.855871177 +0000 UTC m=+20.775857054" Apr 16 08:39:28.856222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.856187 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hh7sm" podStartSLOduration=2.262589884 podStartE2EDuration="20.856176223s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.96757049 +0000 UTC m=+1.887556345" lastFinishedPulling="2026-04-16 08:39:28.561156831 +0000 UTC m=+20.481142684" observedRunningTime="2026-04-16 08:39:28.855521733 +0000 UTC m=+20.775507609" watchObservedRunningTime="2026-04-16 08:39:28.856176223 +0000 UTC m=+20.776162103" Apr 16 08:39:28.871001 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:28.870933 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wrx69" podStartSLOduration=2.544369353 podStartE2EDuration="20.87077758s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.962887935 +0000 UTC m=+1.882873790" lastFinishedPulling="2026-04-16 08:39:28.28929615 +0000 UTC m=+20.209282017" observedRunningTime="2026-04-16 08:39:28.869933165 +0000 UTC m=+20.789919040" watchObservedRunningTime="2026-04-16 08:39:28.87077758 +0000 UTC m=+20.790763456" Apr 16 08:39:29.661696 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.661488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:29.661957 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.661488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:29.661957 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:29.661808 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:29.661957 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:29.661892 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
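
The pod_startup_latency_tracker entries above report two figures per pod: podStartE2EDuration (time from podCreationTimestamp to the observed running time) and podStartSLOduration, which appears to be the same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The multus-hh7sm numbers reproduce under that reading; the arithmetic below checks that interpretation and is not the tracker's actual implementation.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds are accepted when parsing
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        // Values copied from the multus-hh7sm entry above.
        created := parse("2026-04-16 08:39:08 +0000 UTC")
        firstPull := parse("2026-04-16 08:39:09.96757049 +0000 UTC")
        lastPull := parse("2026-04-16 08:39:28.561156831 +0000 UTC")
        running := parse("2026-04-16 08:39:28.856176223 +0000 UTC")

        e2e := running.Sub(created)          // ~20.856176s, the reported podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // ~2.262590s, matching podStartSLOduration under the assumed formula
        fmt.Println(e2e, slo)
    }
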
pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:29.842188 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.842137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xxvcl" event={"ID":"9befbb38-427c-4f04-9ac5-007147cbf0ea","Type":"ContainerStarted","Data":"67755788811e2ced6c3c69a1b03ab08db72ec5ba2d9962c2fd8685ee57684188"} Apr 16 08:39:29.843759 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.843732 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="bd9945c1a84ca0642d5fd1c2015aff2df9e14b2003cd5b94e1f364340c6ecd71" exitCode=0 Apr 16 08:39:29.843885 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.843815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"bd9945c1a84ca0642d5fd1c2015aff2df9e14b2003cd5b94e1f364340c6ecd71"} Apr 16 08:39:29.845952 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.845927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6ngxf" event={"ID":"1ba56a43-ff49-4b6d-a602-289479e4e2f7","Type":"ContainerStarted","Data":"ff53206873ddc99803cc33b35e8d282e7f3b4d8f7dbe9b6dbb897a335ab5ac66"} Apr 16 08:39:29.847488 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.847467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7m67g" event={"ID":"c9e716af-74d3-4a46-891e-32d46250da3e","Type":"ContainerStarted","Data":"60bfe0eb0ec5db8f52c25aebe804d72381ee75be805f6fd50c7f5b8cf4189d80"} Apr 16 08:39:29.848796 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.848774 2578 generic.go:358] "Generic (PLEG): container finished" podID="80da84d77c931c1bc8843e7cd21359fc" containerID="ae26fa8ea953f3cf1b60d9eb900dad0d0caffea4296edf36a1564d22c962fe57" exitCode=0 Apr 16 08:39:29.848893 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.848843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" event={"ID":"80da84d77c931c1bc8843e7cd21359fc","Type":"ContainerDied","Data":"ae26fa8ea953f3cf1b60d9eb900dad0d0caffea4296edf36a1564d22c962fe57"} Apr 16 08:39:29.850121 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.850090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" event={"ID":"8ad5142c-d6f7-4dec-94b9-064b7167a387","Type":"ContainerStarted","Data":"88543df4225d11724891f205c58331d130c275c92e8770bc7674c5ace2ce0ada"} Apr 16 08:39:29.851422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.851400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g5rw9" event={"ID":"654861bc-7246-41e6-a23f-20623cc156ef","Type":"ContainerStarted","Data":"cc73c63272a77afb680bc0bc591956286f122d93e6ede2cda1e11c28553ab709"} Apr 16 08:39:29.854489 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.854469 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:39:29.855349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.855325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" 
event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"a0e4bc687676aa7a7ae440283762c7fe8c21846fb25793ed47c116d3a9c8cb3c"} Apr 16 08:39:29.855447 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.855366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"6420e74ae4ca0085d57a0ea6efce77d5c37370cab02396dd820aa1dcee614fd0"} Apr 16 08:39:29.858097 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.858061 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xxvcl" podStartSLOduration=3.522033852 podStartE2EDuration="21.858049487s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.918159512 +0000 UTC m=+1.838145365" lastFinishedPulling="2026-04-16 08:39:28.254175133 +0000 UTC m=+20.174161000" observedRunningTime="2026-04-16 08:39:29.857867769 +0000 UTC m=+21.777853644" watchObservedRunningTime="2026-04-16 08:39:29.858049487 +0000 UTC m=+21.778035362" Apr 16 08:39:29.871171 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.871128 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7m67g" podStartSLOduration=3.471209101 podStartE2EDuration="21.871114319s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.887883386 +0000 UTC m=+1.807869238" lastFinishedPulling="2026-04-16 08:39:28.287788596 +0000 UTC m=+20.207774456" observedRunningTime="2026-04-16 08:39:29.870983396 +0000 UTC m=+21.790969290" watchObservedRunningTime="2026-04-16 08:39:29.871114319 +0000 UTC m=+21.791100191" Apr 16 08:39:29.904035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.903976 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g5rw9" podStartSLOduration=3.59988993 podStartE2EDuration="21.903959029s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.950128367 +0000 UTC m=+1.870114220" lastFinishedPulling="2026-04-16 08:39:28.254197449 +0000 UTC m=+20.174183319" observedRunningTime="2026-04-16 08:39:29.903293581 +0000 UTC m=+21.823279456" watchObservedRunningTime="2026-04-16 08:39:29.903959029 +0000 UTC m=+21.823944904" Apr 16 08:39:29.938348 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:29.938311 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6ngxf" podStartSLOduration=3.578931861 podStartE2EDuration="21.938295801s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.89483174 +0000 UTC m=+1.814817593" lastFinishedPulling="2026-04-16 08:39:28.254195677 +0000 UTC m=+20.174181533" observedRunningTime="2026-04-16 08:39:29.938089657 +0000 UTC m=+21.858075533" watchObservedRunningTime="2026-04-16 08:39:29.938295801 +0000 UTC m=+21.858281677" Apr 16 08:39:30.105908 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.105885 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 08:39:30.632427 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.632259 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T08:39:30.1059064Z","UUID":"0dc11274-710c-421f-83ec-a939d5b70204","Handler":null,"Name":"","Endpoint":""} Apr 16 08:39:30.634126 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.634104 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 08:39:30.634253 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.634134 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 08:39:30.661954 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.661923 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:30.662102 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:30.662037 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:30.859344 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.859312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" event={"ID":"80da84d77c931c1bc8843e7cd21359fc","Type":"ContainerStarted","Data":"834fb4b276a133afd09288c9e6bbead1157680c63721598bb6825de43a5f3290"} Apr 16 08:39:30.861241 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.861212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" event={"ID":"8ad5142c-d6f7-4dec-94b9-064b7167a387","Type":"ContainerStarted","Data":"4159717408fb6316e4fa0248013a67b357ce18b07f47466f4a6ca75fae4933d0"} Apr 16 08:39:30.877283 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:30.877223 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-8.ec2.internal" podStartSLOduration=22.877204894 podStartE2EDuration="22.877204894s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:39:30.876847732 +0000 UTC m=+22.796833607" watchObservedRunningTime="2026-04-16 08:39:30.877204894 +0000 UTC m=+22.797190771" Apr 16 08:39:31.661726 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:31.661677 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:31.661926 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:31.661683 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:31.661926 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:31.661805 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:31.661926 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:31.661911 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:31.864887 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:31.864850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" event={"ID":"8ad5142c-d6f7-4dec-94b9-064b7167a387","Type":"ContainerStarted","Data":"d3c4abae2a9889a290a7993c01f5f9db5a841c40eac4afb0ab521409834f26c4"} Apr 16 08:39:31.868233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:31.868210 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:39:31.868551 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:31.868514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"21660b5d597b98fc836b9c2e4de3601bfcb75bb3ed8f71bfba266ea9e8247ff7"} Apr 16 08:39:32.518220 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.518182 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:32.627605 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.627571 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:32.628186 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.628167 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:32.641307 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.641258 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nk2sc" podStartSLOduration=3.7408589880000003 podStartE2EDuration="24.641243116s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.957919856 +0000 UTC m=+1.877905710" lastFinishedPulling="2026-04-16 08:39:30.858303984 +0000 UTC m=+22.778289838" observedRunningTime="2026-04-16 08:39:31.880224663 +0000 UTC m=+23.800210539" watchObservedRunningTime="2026-04-16 08:39:32.641243116 +0000 UTC m=+24.561228985" Apr 16 08:39:32.661992 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.661961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:32.662150 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:32.662087 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:32.871359 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:32.871335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g5rw9" Apr 16 08:39:33.662243 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.662211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:33.662439 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:33.662325 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:33.662439 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.662362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:33.662439 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:33.662407 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:33.876248 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.875849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:39:33.877193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.876419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"6b2f4dd1aa8bd9c235dbf09a2e069b52e514a62f04674c1e880a388db0726442"} Apr 16 08:39:33.877193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.876984 2578 scope.go:117] "RemoveContainer" containerID="5609ab8a06e32a00b4bee3235e7a43094c1335adf1066689d287fb1c52307154" Apr 16 08:39:33.877193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.876990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:33.877193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.877029 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:33.877193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.877045 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:33.899340 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.898632 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:33.902402 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:33.902116 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:39:34.661604 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:39:34.661571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:34.661784 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:34.661692 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:34.881520 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:34.881496 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:39:34.882193 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:34.881881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" event={"ID":"e2881a10-5691-4f22-92fd-70bdbdbacec2","Type":"ContainerStarted","Data":"e635c0da6281939121e2ac03301a5498897cb6ff6e08ec565b43a58e1311f86e"} Apr 16 08:39:34.883527 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:34.883503 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="1bf7e5eeedc6dc51fe3501f2ef7cef2c22da3c13c7fe5bb64a8b3a10cc701f63" exitCode=0 Apr 16 08:39:34.883614 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:34.883589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"1bf7e5eeedc6dc51fe3501f2ef7cef2c22da3c13c7fe5bb64a8b3a10cc701f63"} Apr 16 08:39:34.910695 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:34.910650 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" podStartSLOduration=8.510267286 podStartE2EDuration="26.910636088s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.926771594 +0000 UTC m=+1.846757447" lastFinishedPulling="2026-04-16 08:39:28.327140392 +0000 UTC m=+20.247126249" observedRunningTime="2026-04-16 08:39:34.910147636 +0000 UTC m=+26.830133511" watchObservedRunningTime="2026-04-16 08:39:34.910636088 +0000 UTC m=+26.830621962" Apr 16 08:39:35.662282 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.662109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:35.662406 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.662109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:35.662406 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:35.662382 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
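
The "Generic (PLEG): container finished ... exitCode=0" and ContainerDied pairs for openshift-multus/multus-additional-cni-plugins-nsv7m are a series of init containers completing one after another (each installs an additional CNI plugin and exits 0), so this string of ContainerDied events is normal progress rather than a failure. The same exit codes are visible on the pod object itself; the sketch below only shows where they surface in the API and assumes the *corev1.Pod comes from any ordinary client-go lookup.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // printInitExitCodes maps the exitCode values seen in the PLEG events onto the
    // pod's status: each finished init container appears as a terminated state
    // carrying its exit code.
    func printInitExitCodes(pod *corev1.Pod) {
        for _, st := range pod.Status.InitContainerStatuses {
            if term := st.State.Terminated; term != nil {
                fmt.Printf("%s exited with code %d\n", st.Name, term.ExitCode)
            }
        }
    }

    func main() {} // placeholder so the sketch compiles standalone
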
pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:35.662485 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:35.662412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:35.760894 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.760789 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7mf7d"] Apr 16 08:39:35.763909 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.763879 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kvw77"] Apr 16 08:39:35.764376 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.764347 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9sn2"] Apr 16 08:39:35.764478 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.764443 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:35.764563 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:35.764539 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:35.887409 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.887375 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="52edb96b2eeada08ca608a321dd3dca4d5995432f606e67477e2dbe072e4af1b" exitCode=0 Apr 16 08:39:35.887888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.887462 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:35.887888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.887475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:35.887888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:35.887481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"52edb96b2eeada08ca608a321dd3dca4d5995432f606e67477e2dbe072e4af1b"} Apr 16 08:39:35.887888 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:35.887688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:35.887888 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:35.887763 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:36.890859 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:36.890824 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="55f7d4ce30fb26ef6742a9a7de2eaeb6dccfb2828bbf67ea01d1d2285d529a02" exitCode=0 Apr 16 08:39:36.891236 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:36.890866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"55f7d4ce30fb26ef6742a9a7de2eaeb6dccfb2828bbf67ea01d1d2285d529a02"} Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:37.662347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:37.662375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:37.662486 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:37.662529 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:37.662586 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:37.662851 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:37.662684 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:39.662470 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:39.662431 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:39.662904 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:39.662482 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:39.662904 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:39.662578 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mf7d" podUID="dc53c5fe-6772-48e1-b1d9-82b3bf47aca3" Apr 16 08:39:39.662904 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:39.662646 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9sn2" podUID="e7a5a6cf-7715-4262-8d6b-d3268b40a1ad" Apr 16 08:39:39.662904 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:39.662690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:39.662904 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:39.662775 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kvw77" podUID="13e8353c-4eb0-4abd-98df-42ece4ec0318" Apr 16 08:39:41.317807 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.317758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:41.318235 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.317944 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:41.318235 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.318021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:40:13.318000393 +0000 UTC m=+65.237986252 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:39:41.344522 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.344491 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-8.ec2.internal" event="NodeReady" Apr 16 08:39:41.344689 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.344654 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 08:39:41.389235 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.389203 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fqcdx"] Apr 16 08:39:41.416559 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.416469 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvsn"] Apr 16 08:39:41.416736 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.416648 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.418627 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.418434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:41.418792 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.418600 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:39:41.418792 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.418729 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:39:41.418792 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.418746 2578 projected.go:194] Error preparing data for projected volume kube-api-access-9zp5k for pod openshift-network-diagnostics/network-check-target-x9sn2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:41.418950 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.418812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k podName:e7a5a6cf-7715-4262-8d6b-d3268b40a1ad nodeName:}" failed. No retries permitted until 2026-04-16 08:40:13.418792852 +0000 UTC m=+65.338778726 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9zp5k" (UniqueName: "kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k") pod "network-check-target-x9sn2" (UID: "e7a5a6cf-7715-4262-8d6b-d3268b40a1ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:39:41.419363 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.419344 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d2hv\"" Apr 16 08:39:41.419467 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.419394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 08:39:41.419467 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.419344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 08:39:41.434367 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.434334 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fqcdx"] Apr 16 08:39:41.434367 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.434370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvsn"] Apr 16 08:39:41.434542 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.434441 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.437051 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.437028 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 08:39:41.437149 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.437098 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 08:39:41.437149 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.437138 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 08:39:41.437271 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.437258 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q9sj7\"" Apr 16 08:39:41.519003 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.518973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cea109b-1867-4bf4-a48a-15604584a8d2-tmp-dir\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.519222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.519050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzcp\" (UniqueName: \"kubernetes.io/projected/11093fee-55ea-464a-b838-08d5d6f8e907-kube-api-access-qtzcp\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.519222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.519112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod 
\"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.519222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.519145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cea109b-1867-4bf4-a48a-15604584a8d2-config-volume\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.519222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.519190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnlh\" (UniqueName: \"kubernetes.io/projected/8cea109b-1867-4bf4-a48a-15604584a8d2-kube-api-access-zpnlh\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.519222 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.519215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.619544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnlh\" (UniqueName: \"kubernetes.io/projected/8cea109b-1867-4bf4-a48a-15604584a8d2-kube-api-access-zpnlh\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.619544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cea109b-1867-4bf4-a48a-15604584a8d2-tmp-dir\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzcp\" (UniqueName: \"kubernetes.io/projected/11093fee-55ea-464a-b838-08d5d6f8e907-kube-api-access-qtzcp\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.619681 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.619773 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:42.11975735 +0000 UTC m=+34.039743203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.619776 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:41.619834 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.620076 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:41.619841 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:42.119810196 +0000 UTC m=+34.039796055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:39:41.620076 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cea109b-1867-4bf4-a48a-15604584a8d2-config-volume\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.620076 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.619997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cea109b-1867-4bf4-a48a-15604584a8d2-tmp-dir\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.624467 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.624443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cea109b-1867-4bf4-a48a-15604584a8d2-config-volume\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.631104 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.631081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnlh\" (UniqueName: \"kubernetes.io/projected/8cea109b-1867-4bf4-a48a-15604584a8d2-kube-api-access-zpnlh\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:41.631238 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.631187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzcp\" (UniqueName: \"kubernetes.io/projected/11093fee-55ea-464a-b838-08d5d6f8e907-kube-api-access-qtzcp\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:41.662516 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.662476 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:39:41.662681 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.662600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:41.662681 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.662638 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:39:41.665077 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665051 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 08:39:41.665236 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665217 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 08:39:41.665350 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665330 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9gvlt\"" Apr 16 08:39:41.665350 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665227 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 08:39:41.665503 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665054 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 08:39:41.665548 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:41.665529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zdwck\"" Apr 16 08:39:42.124735 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:42.124680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:42.125014 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:42.124802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:42.125014 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:42.124848 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:42.125014 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:42.124910 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:42.125014 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:42.124923 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:43.124904991 +0000 UTC m=+35.044890845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:42.125014 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:42.124962 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:43.124946612 +0000 UTC m=+35.044932482 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:39:43.131335 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:43.131305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:43.131896 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:43.131402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:43.131896 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:43.131452 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:43.131896 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:43.131493 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:43.131896 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:43.131517 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:45.131502627 +0000 UTC m=+37.051488479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:43.131896 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:43.131540 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:45.131528364 +0000 UTC m=+37.051514218 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:39:43.905640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:43.905607 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="98a7f80bc403f7eebd6a7792b769602a997d5d454768f1bfc98a1f14bb022233" exitCode=0 Apr 16 08:39:43.905825 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:43.905653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"98a7f80bc403f7eebd6a7792b769602a997d5d454768f1bfc98a1f14bb022233"} Apr 16 08:39:44.239877 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.239782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:44.242539 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.242505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc53c5fe-6772-48e1-b1d9-82b3bf47aca3-original-pull-secret\") pod \"global-pull-secret-syncer-7mf7d\" (UID: \"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3\") " pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:44.380014 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.379978 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mf7d" Apr 16 08:39:44.548805 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.548613 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7mf7d"] Apr 16 08:39:44.554660 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:39:44.554633 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc53c5fe_6772_48e1_b1d9_82b3bf47aca3.slice/crio-6a58697c86f063924ba6726c6a93948e65e8555509382145d057403b8b249c0d WatchSource:0}: Error finding container 6a58697c86f063924ba6726c6a93948e65e8555509382145d057403b8b249c0d: Status 404 returned error can't find the container with id 6a58697c86f063924ba6726c6a93948e65e8555509382145d057403b8b249c0d Apr 16 08:39:44.908200 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.908161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7mf7d" event={"ID":"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3","Type":"ContainerStarted","Data":"6a58697c86f063924ba6726c6a93948e65e8555509382145d057403b8b249c0d"} Apr 16 08:39:44.910545 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.910519 2578 generic.go:358] "Generic (PLEG): container finished" podID="2583f7f3-820e-46ff-b710-c2256f41f5c1" containerID="ad511af69f8bd60e70038e8bb92f519abc39c9e6dd7cdfb14df38f83ca255250" exitCode=0 Apr 16 08:39:44.910658 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:44.910573 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerDied","Data":"ad511af69f8bd60e70038e8bb92f519abc39c9e6dd7cdfb14df38f83ca255250"} Apr 16 08:39:45.146510 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:45.146469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:45.146690 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:45.146547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:45.146690 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:45.146622 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:45.146690 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:45.146664 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:45.146690 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:45.146691 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:49.146671586 +0000 UTC m=+41.066657441 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:39:45.146920 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:45.146734 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:49.146698419 +0000 UTC m=+41.066684281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:45.915734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:45.915681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" event={"ID":"2583f7f3-820e-46ff-b710-c2256f41f5c1","Type":"ContainerStarted","Data":"f17267f19b00d202b95224803e8704174650b6566b9cee60627236649704b8ee"} Apr 16 08:39:45.937875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:45.937825 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nsv7m" podStartSLOduration=4.930829974 podStartE2EDuration="37.937809747s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:39:09.920946381 +0000 UTC m=+1.840932245" lastFinishedPulling="2026-04-16 08:39:42.927926164 +0000 UTC m=+34.847912018" observedRunningTime="2026-04-16 08:39:45.936135246 +0000 UTC m=+37.856121157" watchObservedRunningTime="2026-04-16 08:39:45.937809747 +0000 UTC m=+37.857795622" Apr 16 08:39:48.922009 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:48.921971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7mf7d" event={"ID":"dc53c5fe-6772-48e1-b1d9-82b3bf47aca3","Type":"ContainerStarted","Data":"ad1d24028af86d6df19adb987ee48ee5ac7ecacad8537544b60922081afdf57c"} Apr 16 08:39:49.180035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:49.179947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:49.180035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:49.179998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:49.180204 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:49.180088 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:49.180204 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:49.180093 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:49.180204 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:49.180147 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:57.180131566 +0000 UTC m=+49.100117420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:49.180204 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:49.180160 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:39:57.180154608 +0000 UTC m=+49.100140461 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:39:57.239533 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:57.239493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:39:57.239996 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:39:57.239592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:39:57.239996 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:57.239651 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:39:57.239996 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:57.239681 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:39:57.239996 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:57.239755 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:40:13.239738055 +0000 UTC m=+65.159723912 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:39:57.239996 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:39:57.239771 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:40:13.239765002 +0000 UTC m=+65.159750855 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:40:05.903272 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:05.903237 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cvzdh" Apr 16 08:40:05.932074 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:05.930118 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7mf7d" podStartSLOduration=49.912461662 podStartE2EDuration="53.930103178s" podCreationTimestamp="2026-04-16 08:39:12 +0000 UTC" firstStartedPulling="2026-04-16 08:39:44.556387034 +0000 UTC m=+36.476372891" lastFinishedPulling="2026-04-16 08:39:48.574028553 +0000 UTC m=+40.494014407" observedRunningTime="2026-04-16 08:39:48.936550479 +0000 UTC m=+40.856536354" watchObservedRunningTime="2026-04-16 08:40:05.930103178 +0000 UTC m=+57.850089052" Apr 16 08:40:13.250060 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.250017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:40:13.250060 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.250070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:40:13.250533 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.250163 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:40:13.250533 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.250168 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:40:13.250533 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.250210 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:40:45.250197653 +0000 UTC m=+97.170183506 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:40:13.250533 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.250246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:40:45.250226838 +0000 UTC m=+97.170212709 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:40:13.350704 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.350671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:40:13.353525 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.353506 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 08:40:13.361452 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.361435 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:40:13.361532 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:13.361486 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs podName:13e8353c-4eb0-4abd-98df-42ece4ec0318 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:17.361470123 +0000 UTC m=+129.281455976 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs") pod "network-metrics-daemon-kvw77" (UID: "13e8353c-4eb0-4abd-98df-42ece4ec0318") : secret "metrics-daemon-secret" not found Apr 16 08:40:13.451444 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.451404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:40:13.454350 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.454331 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 08:40:13.464597 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.464575 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 08:40:13.476162 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.476137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zp5k\" (UniqueName: \"kubernetes.io/projected/e7a5a6cf-7715-4262-8d6b-d3268b40a1ad-kube-api-access-9zp5k\") pod \"network-check-target-x9sn2\" (UID: \"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad\") " pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:40:13.776799 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.776771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zdwck\"" Apr 16 08:40:13.784789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.784764 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:40:13.901090 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.901063 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9sn2"] Apr 16 08:40:13.904308 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:40:13.904270 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a5a6cf_7715_4262_8d6b_d3268b40a1ad.slice/crio-fde765443107750046169a087227e54eaa9d41a7bb5dab9d60085c838ef42aac WatchSource:0}: Error finding container fde765443107750046169a087227e54eaa9d41a7bb5dab9d60085c838ef42aac: Status 404 returned error can't find the container with id fde765443107750046169a087227e54eaa9d41a7bb5dab9d60085c838ef42aac Apr 16 08:40:13.973959 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:13.973918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9sn2" event={"ID":"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad","Type":"ContainerStarted","Data":"fde765443107750046169a087227e54eaa9d41a7bb5dab9d60085c838ef42aac"} Apr 16 08:40:16.982203 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:16.982166 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9sn2" event={"ID":"e7a5a6cf-7715-4262-8d6b-d3268b40a1ad","Type":"ContainerStarted","Data":"a3ab0bc6f00406e579a47e66b0a650d9e4e9178f1ebf74b6e909a38875e5979e"} Apr 16 08:40:16.982563 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:16.982348 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:40:16.997784 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:16.997733 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x9sn2" podStartSLOduration=66.348170483 podStartE2EDuration="1m8.997702964s" podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:40:13.906060227 +0000 UTC m=+65.826046080" lastFinishedPulling="2026-04-16 08:40:16.555592705 +0000 UTC m=+68.475578561" observedRunningTime="2026-04-16 08:40:16.996913491 +0000 UTC m=+68.916899366" watchObservedRunningTime="2026-04-16 08:40:16.997702964 +0000 UTC m=+68.917688842" Apr 16 08:40:45.261318 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:45.261196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:40:45.261318 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:45.261253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:40:45.261823 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:45.261355 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:40:45.261823 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:45.261355 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 08:40:45.261823 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:45.261434 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls podName:8cea109b-1867-4bf4-a48a-15604584a8d2 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:49.261411153 +0000 UTC m=+161.181397011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls") pod "dns-default-fqcdx" (UID: "8cea109b-1867-4bf4-a48a-15604584a8d2") : secret "dns-default-metrics-tls" not found Apr 16 08:40:45.261823 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:40:45.261451 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert podName:11093fee-55ea-464a-b838-08d5d6f8e907 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:49.261440777 +0000 UTC m=+161.181426629 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert") pod "ingress-canary-4tvsn" (UID: "11093fee-55ea-464a-b838-08d5d6f8e907") : secret "canary-serving-cert" not found Apr 16 08:40:47.986529 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:40:47.986499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x9sn2" Apr 16 08:41:00.430045 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.430011 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq"] Apr 16 08:41:00.431691 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.431675 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.434270 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.434245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 08:41:00.434403 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.434289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 08:41:00.434403 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.434254 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 08:41:00.435379 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.435362 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:41:00.435464 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.435391 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-t5jrh\"" Apr 16 08:41:00.440528 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.440511 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq"] Apr 16 08:41:00.463903 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.463873 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7wm\" (UniqueName: \"kubernetes.io/projected/c49165de-d15e-468a-9b37-71d0defef4a1-kube-api-access-mq7wm\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.464013 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.463937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c49165de-d15e-468a-9b37-71d0defef4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.464013 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.463983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c49165de-d15e-468a-9b37-71d0defef4a1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.537588 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.537556 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-nz5z7"] Apr 16 08:41:00.539479 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.539462 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.542165 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.542142 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-76kkz\"" Apr 16 08:41:00.542285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.542148 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 08:41:00.542285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.542182 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 08:41:00.542285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.542146 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 08:41:00.542452 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.542293 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 08:41:00.547906 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.547886 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 08:41:00.549958 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.549937 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-nz5z7"] Apr 16 08:41:00.564849 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.564825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-tmp\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.564968 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.564858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f24f3e-056d-4441-bbc0-42973fb6dcc4-serving-cert\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.564968 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.564905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c49165de-d15e-468a-9b37-71d0defef4a1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.564968 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.564938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7wm\" (UniqueName: \"kubernetes.io/projected/c49165de-d15e-468a-9b37-71d0defef4a1-kube-api-access-mq7wm\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.565067 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.564990 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.565067 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.565020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c49165de-d15e-468a-9b37-71d0defef4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.565155 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.565129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-snapshots\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.565202 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.565186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.565261 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.565238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g765v\" (UniqueName: \"kubernetes.io/projected/97f24f3e-056d-4441-bbc0-42973fb6dcc4-kube-api-access-g765v\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.565394 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.565379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c49165de-d15e-468a-9b37-71d0defef4a1-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.567349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.567329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c49165de-d15e-468a-9b37-71d0defef4a1-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.574129 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.574109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7wm\" (UniqueName: \"kubernetes.io/projected/c49165de-d15e-468a-9b37-71d0defef4a1-kube-api-access-mq7wm\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m7lwq\" (UID: \"c49165de-d15e-468a-9b37-71d0defef4a1\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.645825 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.645789 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t"] Apr 16 08:41:00.648029 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.648007 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4"] Apr 16 08:41:00.648172 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.648156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" Apr 16 08:41:00.649988 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.649968 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-p5qrs"] Apr 16 08:41:00.650119 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.650102 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.650900 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.650883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mzfbh\"" Apr 16 08:41:00.651782 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.651767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.653099 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.653082 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 08:41:00.653204 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.653115 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 08:41:00.653204 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.653084 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:41:00.653571 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.653552 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-s7l86\"" Apr 16 08:41:00.654385 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.654365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:41:00.654811 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.654793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 08:41:00.654898 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.654855 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 08:41:00.654898 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.654817 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xk74d\"" Apr 16 08:41:00.654898 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.654813 2578 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 08:41:00.665453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 08:41:00.665654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665771 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-snapshots\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665771 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665771 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-trusted-ca\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nr4\" (UniqueName: \"kubernetes.io/projected/d067a162-f63e-4f9c-a4fe-f75f406f7444-kube-api-access-g5nr4\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g765v\" (UniqueName: \"kubernetes.io/projected/97f24f3e-056d-4441-bbc0-42973fb6dcc4-kube-api-access-g765v\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665887 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-tmp\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f24f3e-056d-4441-bbc0-42973fb6dcc4-serving-cert\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.665947 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-config\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.666205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.665985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlljj\" (UniqueName: \"kubernetes.io/projected/5c2240e1-8fc0-49b6-9c23-73dddaed0476-kube-api-access-mlljj\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.666205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wqc\" (UniqueName: \"kubernetes.io/projected/8fabd7f4-27fb-41c6-afdd-d79b183eaa59-kube-api-access-m2wqc\") pod \"network-check-source-7b678d77c7-mxc5t\" (UID: \"8fabd7f4-27fb-41c6-afdd-d79b183eaa59\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" Apr 16 08:41:00.666205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2240e1-8fc0-49b6-9c23-73dddaed0476-serving-cert\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.666338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666319 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.666436 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-snapshots\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.666629 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97f24f3e-056d-4441-bbc0-42973fb6dcc4-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.666799 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.666625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97f24f3e-056d-4441-bbc0-42973fb6dcc4-tmp\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.669519 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.669496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f24f3e-056d-4441-bbc0-42973fb6dcc4-serving-cert\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.671621 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.671600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t"] Apr 16 08:41:00.673305 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.673254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4"] Apr 16 08:41:00.674348 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.674328 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-p5qrs"] Apr 16 08:41:00.683762 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.683695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g765v\" (UniqueName: \"kubernetes.io/projected/97f24f3e-056d-4441-bbc0-42973fb6dcc4-kube-api-access-g765v\") pod \"insights-operator-5785d4fcdd-nz5z7\" (UID: \"97f24f3e-056d-4441-bbc0-42973fb6dcc4\") " pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.740805 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.740764 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" Apr 16 08:41:00.766789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlljj\" (UniqueName: \"kubernetes.io/projected/5c2240e1-8fc0-49b6-9c23-73dddaed0476-kube-api-access-mlljj\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.766929 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wqc\" (UniqueName: \"kubernetes.io/projected/8fabd7f4-27fb-41c6-afdd-d79b183eaa59-kube-api-access-m2wqc\") pod \"network-check-source-7b678d77c7-mxc5t\" (UID: \"8fabd7f4-27fb-41c6-afdd-d79b183eaa59\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" Apr 16 08:41:00.766929 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2240e1-8fc0-49b6-9c23-73dddaed0476-serving-cert\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.766929 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-trusted-ca\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.766929 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.766929 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nr4\" (UniqueName: \"kubernetes.io/projected/d067a162-f63e-4f9c-a4fe-f75f406f7444-kube-api-access-g5nr4\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.767211 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.766927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-config\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.767211 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:00.766994 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 08:41:00.767211 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:00.767054 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls podName:d067a162-f63e-4f9c-a4fe-f75f406f7444 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:01.267035166 +0000 UTC m=+113.187021027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls") pod "cluster-samples-operator-667775844f-pl7x4" (UID: "d067a162-f63e-4f9c-a4fe-f75f406f7444") : secret "samples-operator-tls" not found Apr 16 08:41:00.767629 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.767602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-config\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.767775 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.767732 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c2240e1-8fc0-49b6-9c23-73dddaed0476-trusted-ca\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.769291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.769266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2240e1-8fc0-49b6-9c23-73dddaed0476-serving-cert\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.776326 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.776297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nr4\" (UniqueName: \"kubernetes.io/projected/d067a162-f63e-4f9c-a4fe-f75f406f7444-kube-api-access-g5nr4\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:00.776496 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.776476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlljj\" (UniqueName: \"kubernetes.io/projected/5c2240e1-8fc0-49b6-9c23-73dddaed0476-kube-api-access-mlljj\") pod \"console-operator-d87b8d5fc-p5qrs\" (UID: \"5c2240e1-8fc0-49b6-9c23-73dddaed0476\") " pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:00.776573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.776480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wqc\" (UniqueName: \"kubernetes.io/projected/8fabd7f4-27fb-41c6-afdd-d79b183eaa59-kube-api-access-m2wqc\") pod \"network-check-source-7b678d77c7-mxc5t\" (UID: \"8fabd7f4-27fb-41c6-afdd-d79b183eaa59\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" Apr 16 08:41:00.849231 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.849200 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" Apr 16 08:41:00.853886 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.853857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq"] Apr 16 08:41:00.858311 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:00.858287 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49165de_d15e_468a_9b37_71d0defef4a1.slice/crio-673f0e1e51572fcf9c6d11b5e27480ab5b4c6f24ba4c7386f5d056c8a5e10552 WatchSource:0}: Error finding container 673f0e1e51572fcf9c6d11b5e27480ab5b4c6f24ba4c7386f5d056c8a5e10552: Status 404 returned error can't find the container with id 673f0e1e51572fcf9c6d11b5e27480ab5b4c6f24ba4c7386f5d056c8a5e10552 Apr 16 08:41:00.959458 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.959381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" Apr 16 08:41:00.963196 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.963168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-nz5z7"] Apr 16 08:41:00.966258 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:00.966213 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f24f3e_056d_4441_bbc0_42973fb6dcc4.slice/crio-12d17142a9203f42eb072068262980937644c7c3bcf808780a7680c2ae34537a WatchSource:0}: Error finding container 12d17142a9203f42eb072068262980937644c7c3bcf808780a7680c2ae34537a: Status 404 returned error can't find the container with id 12d17142a9203f42eb072068262980937644c7c3bcf808780a7680c2ae34537a Apr 16 08:41:00.976785 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:00.976759 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:01.076445 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:01.076410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" event={"ID":"97f24f3e-056d-4441-bbc0-42973fb6dcc4","Type":"ContainerStarted","Data":"12d17142a9203f42eb072068262980937644c7c3bcf808780a7680c2ae34537a"} Apr 16 08:41:01.077410 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:01.077383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" event={"ID":"c49165de-d15e-468a-9b37-71d0defef4a1","Type":"ContainerStarted","Data":"673f0e1e51572fcf9c6d11b5e27480ab5b4c6f24ba4c7386f5d056c8a5e10552"} Apr 16 08:41:01.081674 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:01.081647 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t"] Apr 16 08:41:01.086010 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:01.085983 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fabd7f4_27fb_41c6_afdd_d79b183eaa59.slice/crio-768306dd422d7ace4edef6c0d1be8628f19638919b9b39e0f26261bbde482093 WatchSource:0}: Error finding container 768306dd422d7ace4edef6c0d1be8628f19638919b9b39e0f26261bbde482093: Status 404 returned error can't find the container with id 768306dd422d7ace4edef6c0d1be8628f19638919b9b39e0f26261bbde482093 Apr 16 08:41:01.103528 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:01.103504 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-p5qrs"] Apr 16 08:41:01.110730 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:01.110688 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2240e1_8fc0_49b6_9c23_73dddaed0476.slice/crio-7ddab9ba71a2f0c6163ff625525d0eb35db6b3ddf1002580f5f50f027cb0a716 WatchSource:0}: Error finding container 7ddab9ba71a2f0c6163ff625525d0eb35db6b3ddf1002580f5f50f027cb0a716: Status 404 returned error can't find the container with id 7ddab9ba71a2f0c6163ff625525d0eb35db6b3ddf1002580f5f50f027cb0a716 Apr 16 08:41:01.270892 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:01.270787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:01.271048 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:01.270962 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 08:41:01.271048 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:01.271039 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls podName:d067a162-f63e-4f9c-a4fe-f75f406f7444 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:02.271017494 +0000 UTC m=+114.191003350 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls") pod "cluster-samples-operator-667775844f-pl7x4" (UID: "d067a162-f63e-4f9c-a4fe-f75f406f7444") : secret "samples-operator-tls" not found Apr 16 08:41:02.082378 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:02.082341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" event={"ID":"8fabd7f4-27fb-41c6-afdd-d79b183eaa59","Type":"ContainerStarted","Data":"1cebd1e58fd0fec3b8f73f82960a0df54d0dcea4bd51ff0a85092e9cca9ec1d0"} Apr 16 08:41:02.082378 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:02.082387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" event={"ID":"8fabd7f4-27fb-41c6-afdd-d79b183eaa59","Type":"ContainerStarted","Data":"768306dd422d7ace4edef6c0d1be8628f19638919b9b39e0f26261bbde482093"} Apr 16 08:41:02.084212 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:02.084129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" event={"ID":"5c2240e1-8fc0-49b6-9c23-73dddaed0476","Type":"ContainerStarted","Data":"7ddab9ba71a2f0c6163ff625525d0eb35db6b3ddf1002580f5f50f027cb0a716"} Apr 16 08:41:02.098201 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:02.098155 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mxc5t" podStartSLOduration=2.098120416 podStartE2EDuration="2.098120416s" podCreationTimestamp="2026-04-16 08:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:41:02.097123535 +0000 UTC m=+114.017109411" watchObservedRunningTime="2026-04-16 08:41:02.098120416 +0000 UTC m=+114.018106290" Apr 16 08:41:02.277910 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:02.277873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:02.278101 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:02.278051 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 08:41:02.278167 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:02.278109 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls podName:d067a162-f63e-4f9c-a4fe-f75f406f7444 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:04.278091803 +0000 UTC m=+116.198077669 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls") pod "cluster-samples-operator-667775844f-pl7x4" (UID: "d067a162-f63e-4f9c-a4fe-f75f406f7444") : secret "samples-operator-tls" not found Apr 16 08:41:04.090015 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.089975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" event={"ID":"97f24f3e-056d-4441-bbc0-42973fb6dcc4","Type":"ContainerStarted","Data":"21d522cbbb86f3561ebe2bd3a9225e0778065e9292a15b4fdda05458edeac39c"} Apr 16 08:41:04.091217 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.091195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" event={"ID":"c49165de-d15e-468a-9b37-71d0defef4a1","Type":"ContainerStarted","Data":"fdf1b3bb3fb9f3aa8c38830e658ae077f95f394914a5375f027bf11b1ad1dafe"} Apr 16 08:41:04.092566 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.092540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/0.log" Apr 16 08:41:04.092656 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.092581 2578 generic.go:358] "Generic (PLEG): container finished" podID="5c2240e1-8fc0-49b6-9c23-73dddaed0476" containerID="60cfcc6402c7ceb48a74d0eec908efa30e18e972c54c5227905426e6517c7465" exitCode=255 Apr 16 08:41:04.092656 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.092610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" event={"ID":"5c2240e1-8fc0-49b6-9c23-73dddaed0476","Type":"ContainerDied","Data":"60cfcc6402c7ceb48a74d0eec908efa30e18e972c54c5227905426e6517c7465"} Apr 16 08:41:04.092843 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.092827 2578 scope.go:117] "RemoveContainer" containerID="60cfcc6402c7ceb48a74d0eec908efa30e18e972c54c5227905426e6517c7465" Apr 16 08:41:04.107285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.107239 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" podStartSLOduration=1.343079575 podStartE2EDuration="4.107223983s" podCreationTimestamp="2026-04-16 08:41:00 +0000 UTC" firstStartedPulling="2026-04-16 08:41:00.968194014 +0000 UTC m=+112.888179891" lastFinishedPulling="2026-04-16 08:41:03.732338445 +0000 UTC m=+115.652324299" observedRunningTime="2026-04-16 08:41:04.106467197 +0000 UTC m=+116.026453115" watchObservedRunningTime="2026-04-16 08:41:04.107223983 +0000 UTC m=+116.027209884" Apr 16 08:41:04.124520 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:04.124469 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" podStartSLOduration=1.2556230720000001 podStartE2EDuration="4.12445079s" podCreationTimestamp="2026-04-16 08:41:00 +0000 UTC" firstStartedPulling="2026-04-16 08:41:00.860128614 +0000 UTC m=+112.780114469" lastFinishedPulling="2026-04-16 08:41:03.728956326 +0000 UTC m=+115.648942187" observedRunningTime="2026-04-16 08:41:04.124340818 +0000 UTC m=+116.044326718" watchObservedRunningTime="2026-04-16 08:41:04.12445079 +0000 UTC m=+116.044436668" Apr 16 08:41:04.297771 ip-10-0-139-8 kubenswrapper[2578]: 
I0416 08:41:04.297705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:04.297915 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:04.297889 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 08:41:04.297995 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:04.297983 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls podName:d067a162-f63e-4f9c-a4fe-f75f406f7444 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:08.297961063 +0000 UTC m=+120.217946920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls") pod "cluster-samples-operator-667775844f-pl7x4" (UID: "d067a162-f63e-4f9c-a4fe-f75f406f7444") : secret "samples-operator-tls" not found Apr 16 08:41:05.096623 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.096597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:41:05.097007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.096983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/0.log" Apr 16 08:41:05.097067 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.097018 2578 generic.go:358] "Generic (PLEG): container finished" podID="5c2240e1-8fc0-49b6-9c23-73dddaed0476" containerID="46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6" exitCode=255 Apr 16 08:41:05.097118 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.097055 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" event={"ID":"5c2240e1-8fc0-49b6-9c23-73dddaed0476","Type":"ContainerDied","Data":"46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6"} Apr 16 08:41:05.097118 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.097104 2578 scope.go:117] "RemoveContainer" containerID="60cfcc6402c7ceb48a74d0eec908efa30e18e972c54c5227905426e6517c7465" Apr 16 08:41:05.097375 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:05.097352 2578 scope.go:117] "RemoveContainer" containerID="46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6" Apr 16 08:41:05.097627 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:05.097608 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-p5qrs_openshift-console-operator(5c2240e1-8fc0-49b6-9c23-73dddaed0476)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" podUID="5c2240e1-8fc0-49b6-9c23-73dddaed0476" Apr 16 08:41:06.014205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.014174 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-j8sn4"] Apr 
16 08:41:06.016504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.016486 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.019167 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.019140 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 08:41:06.019267 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.019166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 08:41:06.020293 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.020275 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 08:41:06.020416 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.020296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cgxhz\"" Apr 16 08:41:06.020416 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.020403 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 08:41:06.024337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.024316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-j8sn4"] Apr 16 08:41:06.101210 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.101187 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:41:06.101608 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.101503 2578 scope.go:117] "RemoveContainer" containerID="46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6" Apr 16 08:41:06.101680 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:06.101663 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-p5qrs_openshift-console-operator(5c2240e1-8fc0-49b6-9c23-73dddaed0476)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" podUID="5c2240e1-8fc0-49b6-9c23-73dddaed0476" Apr 16 08:41:06.114042 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.114018 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/54aedb4f-36c8-436f-92ee-3de821938134-signing-key\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.114162 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.114044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5kzf\" (UniqueName: \"kubernetes.io/projected/54aedb4f-36c8-436f-92ee-3de821938134-kube-api-access-k5kzf\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.114162 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.114090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/54aedb4f-36c8-436f-92ee-3de821938134-signing-cabundle\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.215400 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.215352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/54aedb4f-36c8-436f-92ee-3de821938134-signing-key\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.215559 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.215411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5kzf\" (UniqueName: \"kubernetes.io/projected/54aedb4f-36c8-436f-92ee-3de821938134-kube-api-access-k5kzf\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.215559 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.215516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/54aedb4f-36c8-436f-92ee-3de821938134-signing-cabundle\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.216160 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.216133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/54aedb4f-36c8-436f-92ee-3de821938134-signing-cabundle\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.217939 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.217922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/54aedb4f-36c8-436f-92ee-3de821938134-signing-key\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.224512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.224478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5kzf\" (UniqueName: \"kubernetes.io/projected/54aedb4f-36c8-436f-92ee-3de821938134-kube-api-access-k5kzf\") pod \"service-ca-bfc587fb7-j8sn4\" (UID: \"54aedb4f-36c8-436f-92ee-3de821938134\") " pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.325617 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.325529 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" Apr 16 08:41:06.442496 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.442458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-j8sn4"] Apr 16 08:41:06.445233 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:06.445205 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54aedb4f_36c8_436f_92ee_3de821938134.slice/crio-9effc52d786f10144c82c375873f5ff98853aeff414baf224e982a582465ae36 WatchSource:0}: Error finding container 9effc52d786f10144c82c375873f5ff98853aeff414baf224e982a582465ae36: Status 404 returned error can't find the container with id 9effc52d786f10144c82c375873f5ff98853aeff414baf224e982a582465ae36 Apr 16 08:41:06.489859 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:06.489830 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6ngxf_1ba56a43-ff49-4b6d-a602-289479e4e2f7/dns-node-resolver/0.log" Apr 16 08:41:07.108205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:07.108174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" event={"ID":"54aedb4f-36c8-436f-92ee-3de821938134","Type":"ContainerStarted","Data":"9effc52d786f10144c82c375873f5ff98853aeff414baf224e982a582465ae36"} Apr 16 08:41:07.686490 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:07.686460 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xxvcl_9befbb38-427c-4f04-9ac5-007147cbf0ea/node-ca/0.log" Apr 16 08:41:08.112434 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:08.112396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" event={"ID":"54aedb4f-36c8-436f-92ee-3de821938134","Type":"ContainerStarted","Data":"fa114ce96da90aa76715916ab5aad380c7cb4b6931fe4e57418b97d1e695cbbc"} Apr 16 08:41:08.127382 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:08.127338 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-j8sn4" podStartSLOduration=1.585269996 podStartE2EDuration="3.127325327s" podCreationTimestamp="2026-04-16 08:41:05 +0000 UTC" firstStartedPulling="2026-04-16 08:41:06.44748317 +0000 UTC m=+118.367469023" lastFinishedPulling="2026-04-16 08:41:07.989538501 +0000 UTC m=+119.909524354" observedRunningTime="2026-04-16 08:41:08.126034615 +0000 UTC m=+120.046020491" watchObservedRunningTime="2026-04-16 08:41:08.127325327 +0000 UTC m=+120.047311248" Apr 16 08:41:08.330768 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:08.330656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:08.330916 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:08.330780 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 08:41:08.330916 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:08.330844 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls 
podName:d067a162-f63e-4f9c-a4fe-f75f406f7444 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:16.330828405 +0000 UTC m=+128.250814258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls") pod "cluster-samples-operator-667775844f-pl7x4" (UID: "d067a162-f63e-4f9c-a4fe-f75f406f7444") : secret "samples-operator-tls" not found Apr 16 08:41:10.977847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:10.977806 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:10.977847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:10.977842 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:10.978353 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:10.978329 2578 scope.go:117] "RemoveContainer" containerID="46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6" Apr 16 08:41:10.978534 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:10.978512 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-p5qrs_openshift-console-operator(5c2240e1-8fc0-49b6-9c23-73dddaed0476)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" podUID="5c2240e1-8fc0-49b6-9c23-73dddaed0476" Apr 16 08:41:16.398973 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:16.398932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:16.401441 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:16.401419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d067a162-f63e-4f9c-a4fe-f75f406f7444-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-pl7x4\" (UID: \"d067a162-f63e-4f9c-a4fe-f75f406f7444\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:16.567626 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:16.567599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-s7l86\"" Apr 16 08:41:16.575322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:16.575298 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" Apr 16 08:41:16.695554 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:16.695523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4"] Apr 16 08:41:17.136013 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.135978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" event={"ID":"d067a162-f63e-4f9c-a4fe-f75f406f7444","Type":"ContainerStarted","Data":"6716ef875ca934e7553f82b1df6f3f40087b0e1cc4039e7930a00d22c8783eeb"} Apr 16 08:41:17.408514 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.408431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:41:17.411254 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.411222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13e8353c-4eb0-4abd-98df-42ece4ec0318-metrics-certs\") pod \"network-metrics-daemon-kvw77\" (UID: \"13e8353c-4eb0-4abd-98df-42ece4ec0318\") " pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:41:17.687746 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.687651 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9gvlt\"" Apr 16 08:41:17.695914 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.695882 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kvw77" Apr 16 08:41:17.827322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:17.827293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kvw77"] Apr 16 08:41:18.256619 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:18.256582 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e8353c_4eb0_4abd_98df_42ece4ec0318.slice/crio-a2eb167502845f13f285b227245fc1b05174a275596022e9d1b49dec0d9bdf7a WatchSource:0}: Error finding container a2eb167502845f13f285b227245fc1b05174a275596022e9d1b49dec0d9bdf7a: Status 404 returned error can't find the container with id a2eb167502845f13f285b227245fc1b05174a275596022e9d1b49dec0d9bdf7a Apr 16 08:41:19.142619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:19.142586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kvw77" event={"ID":"13e8353c-4eb0-4abd-98df-42ece4ec0318","Type":"ContainerStarted","Data":"a2eb167502845f13f285b227245fc1b05174a275596022e9d1b49dec0d9bdf7a"} Apr 16 08:41:19.144380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:19.144343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" event={"ID":"d067a162-f63e-4f9c-a4fe-f75f406f7444","Type":"ContainerStarted","Data":"4ef5aa05da708166b78d46bdca70b16a41c91bc4a682698ede68c8dcf5aa3c58"} Apr 16 08:41:19.144380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:19.144380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" event={"ID":"d067a162-f63e-4f9c-a4fe-f75f406f7444","Type":"ContainerStarted","Data":"a42a8ce7c2026b6ddc57ed40b4bbaa73ceff305f2d917bcb4a2821fb2793f333"} Apr 16 08:41:19.160115 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:19.160069 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-pl7x4" podStartSLOduration=17.610050068 podStartE2EDuration="19.160053661s" podCreationTimestamp="2026-04-16 08:41:00 +0000 UTC" firstStartedPulling="2026-04-16 08:41:16.733812727 +0000 UTC m=+128.653798581" lastFinishedPulling="2026-04-16 08:41:18.28381632 +0000 UTC m=+130.203802174" observedRunningTime="2026-04-16 08:41:19.159309955 +0000 UTC m=+131.079295854" watchObservedRunningTime="2026-04-16 08:41:19.160053661 +0000 UTC m=+131.080039536" Apr 16 08:41:20.148642 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:20.148601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kvw77" event={"ID":"13e8353c-4eb0-4abd-98df-42ece4ec0318","Type":"ContainerStarted","Data":"d1c1ec49105fb7b535f5a3acc58955385de58cc96db47731e98efbff6a0bf35c"} Apr 16 08:41:20.149035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:20.148648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kvw77" event={"ID":"13e8353c-4eb0-4abd-98df-42ece4ec0318","Type":"ContainerStarted","Data":"b39b5e02db337f37e80e08c1ad410515a401a441fc4a0f0d6999ee320049c743"} Apr 16 08:41:20.167101 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:20.167055 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kvw77" podStartSLOduration=131.189609417 podStartE2EDuration="2m12.167040933s" 
podCreationTimestamp="2026-04-16 08:39:08 +0000 UTC" firstStartedPulling="2026-04-16 08:41:18.258574567 +0000 UTC m=+130.178560422" lastFinishedPulling="2026-04-16 08:41:19.236006067 +0000 UTC m=+131.155991938" observedRunningTime="2026-04-16 08:41:20.166217108 +0000 UTC m=+132.086202983" watchObservedRunningTime="2026-04-16 08:41:20.167040933 +0000 UTC m=+132.087026842" Apr 16 08:41:21.662016 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:21.661983 2578 scope.go:117] "RemoveContainer" containerID="46316751a0c68721e4f039a03767fafd307017d359736fb2faedfb1036a07fc6" Apr 16 08:41:22.160263 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:22.160235 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:41:22.160426 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:22.160313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" event={"ID":"5c2240e1-8fc0-49b6-9c23-73dddaed0476","Type":"ContainerStarted","Data":"f924fdfd1c4dc80db8206ca94124280bc5e3ea85b337da7109cbbe34c23c4281"} Apr 16 08:41:22.160609 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:22.160591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:22.177427 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:22.177382 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" podStartSLOduration=19.556040966 podStartE2EDuration="22.177367915s" podCreationTimestamp="2026-04-16 08:41:00 +0000 UTC" firstStartedPulling="2026-04-16 08:41:01.112436932 +0000 UTC m=+113.032422784" lastFinishedPulling="2026-04-16 08:41:03.73376387 +0000 UTC m=+115.653749733" observedRunningTime="2026-04-16 08:41:22.176096257 +0000 UTC m=+134.096082136" watchObservedRunningTime="2026-04-16 08:41:22.177367915 +0000 UTC m=+134.097353789" Apr 16 08:41:22.543826 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:22.543747 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-p5qrs" Apr 16 08:41:25.964925 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.964891 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2rm2g"] Apr 16 08:41:25.967160 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.967138 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:25.970895 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.970872 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jjxz9\"" Apr 16 08:41:25.971016 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.970893 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 08:41:25.971016 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.970880 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 08:41:25.978438 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.978411 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f45f95974-smmsv"] Apr 16 08:41:25.980305 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.980288 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:25.982331 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.982312 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2rm2g"] Apr 16 08:41:25.982827 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.982809 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9nxr\"" Apr 16 08:41:25.982972 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.982908 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 08:41:25.983042 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.982976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 08:41:25.983122 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.983101 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 08:41:25.990081 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:25.990064 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 08:41:26.010431 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.010408 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f45f95974-smmsv"] Apr 16 08:41:26.076030 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.075997 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-lgk8t"] Apr 16 08:41:26.077980 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.077965 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:26.079596 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkvk\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.079736 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.079736 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.079736 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.079913 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.080287 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.079819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.080287 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cbjk\" (UniqueName: \"kubernetes.io/projected/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-api-access-4cbjk\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.080287 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a5de9ed-1057-4943-9b06-aade8bc24270-data-volume\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a5de9ed-1057-4943-9b06-aade8bc24270-crio-socket\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a5de9ed-1057-4943-9b06-aade8bc24270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080863 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.080900 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 08:41:26.083734 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.081432 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4ztkf\"" Apr 16 08:41:26.085303 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.085278 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f45f95974-smmsv"] Apr 16 08:41:26.085996 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:26.085959 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets 
kube-api-access-pgkvk registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-f45f95974-smmsv" podUID="b2a84042-f667-4c3a-9654-28b88137d9f8" Apr 16 08:41:26.093407 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.093384 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-lgk8t"] Apr 16 08:41:26.123990 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.123960 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bbbf98cd9-n68bb"] Apr 16 08:41:26.126487 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.126460 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.139133 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.139099 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bbbf98cd9-n68bb"] Apr 16 08:41:26.168743 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.168697 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.172983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.172962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.181670 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a5de9ed-1057-4943-9b06-aade8bc24270-data-volume\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a5de9ed-1057-4943-9b06-aade8bc24270-crio-socket\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " 
pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a5de9ed-1057-4943-9b06-aade8bc24270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkvk\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.181846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a5de9ed-1057-4943-9b06-aade8bc24270-crio-socket\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.181977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpl29\" (UniqueName: \"kubernetes.io/projected/74643581-ba72-4b9d-ae5a-bc893d97b6a0-kube-api-access-mpl29\") pod \"downloads-586b57c7b4-lgk8t\" (UID: \"74643581-ba72-4b9d-ae5a-bc893d97b6a0\") " pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 
08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.182016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.182043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbjk\" (UniqueName: \"kubernetes.io/projected/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-api-access-4cbjk\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.182177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.182163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a5de9ed-1057-4943-9b06-aade8bc24270-data-volume\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.182543 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.182375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.182879 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.182858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.183089 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.183066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.183365 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.183113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.184545 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.184523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a5de9ed-1057-4943-9b06-aade8bc24270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.184651 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.184597 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.184814 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.184799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.185080 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.185060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.193425 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.193402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.193500 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.193480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbjk\" (UniqueName: \"kubernetes.io/projected/5a5de9ed-1057-4943-9b06-aade8bc24270-kube-api-access-4cbjk\") pod \"insights-runtime-extractor-2rm2g\" (UID: \"5a5de9ed-1057-4943-9b06-aade8bc24270\") " pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.193544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.193499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkvk\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk\") pod \"image-registry-f45f95974-smmsv\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:26.276197 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.276103 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2rm2g" Apr 16 08:41:26.283763 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.283763 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283735 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.283928 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283805 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkvk\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.283928 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283835 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.283928 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283873 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.283928 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283912 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.284122 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283957 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.284122 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.283981 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token\") pod \"b2a84042-f667-4c3a-9654-28b88137d9f8\" (UID: \"b2a84042-f667-4c3a-9654-28b88137d9f8\") " Apr 16 08:41:26.284291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1231a7af-9e87-4c88-9b24-457ae238ae51-ca-trust-extracted\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " 
pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.284291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284250 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:41:26.284409 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpl29\" (UniqueName: \"kubernetes.io/projected/74643581-ba72-4b9d-ae5a-bc893d97b6a0-kube-api-access-mpl29\") pod \"downloads-586b57c7b4-lgk8t\" (UID: \"74643581-ba72-4b9d-ae5a-bc893d97b6a0\") " pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:26.284553 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284485 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:41:26.284553 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-bound-sa-token\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.284954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-image-registry-private-configuration\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-trusted-ca\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4tq\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-kube-api-access-tk4tq\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-certificates\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-installation-pull-secrets\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-tls\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285193 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2a84042-f667-4c3a-9654-28b88137d9f8-ca-trust-extracted\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285213 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-trusted-ca\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.286057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.285227 2578 reconciler_common.go:299] "Volume detached 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-certificates\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.286872 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.286821 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:41:26.286872 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.286843 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:41:26.287334 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.287124 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk" (OuterVolumeSpecName: "kube-api-access-pgkvk") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "kube-api-access-pgkvk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:41:26.287537 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.287500 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:41:26.288407 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.288384 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b2a84042-f667-4c3a-9654-28b88137d9f8" (UID: "b2a84042-f667-4c3a-9654-28b88137d9f8"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:41:26.293430 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.293406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpl29\" (UniqueName: \"kubernetes.io/projected/74643581-ba72-4b9d-ae5a-bc893d97b6a0-kube-api-access-mpl29\") pod \"downloads-586b57c7b4-lgk8t\" (UID: \"74643581-ba72-4b9d-ae5a-bc893d97b6a0\") " pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:26.386231 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-bound-sa-token\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-image-registry-private-configuration\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-trusted-ca\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4tq\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-kube-api-access-tk4tq\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-certificates\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386384 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-installation-pull-secrets\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-tls\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 
08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1231a7af-9e87-4c88-9b24-457ae238ae51-ca-trust-extracted\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386629 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-bound-sa-token\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386648 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-installation-pull-secrets\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386664 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2a84042-f667-4c3a-9654-28b88137d9f8-image-registry-private-configuration\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386681 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgkvk\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-kube-api-access-pgkvk\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.386817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.386704 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2a84042-f667-4c3a-9654-28b88137d9f8-registry-tls\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:41:26.387158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.387051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1231a7af-9e87-4c88-9b24-457ae238ae51-ca-trust-extracted\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.387557 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.387430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-certificates\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.387557 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.387457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1231a7af-9e87-4c88-9b24-457ae238ae51-trusted-ca\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.389046 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.389009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-installation-pull-secrets\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.389147 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.389065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1231a7af-9e87-4c88-9b24-457ae238ae51-image-registry-private-configuration\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.389249 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.389232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-registry-tls\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.389341 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.389330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:26.399886 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.399861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-bound-sa-token\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.400136 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.400086 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2rm2g"] Apr 16 08:41:26.402043 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.402013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4tq\" (UniqueName: \"kubernetes.io/projected/1231a7af-9e87-4c88-9b24-457ae238ae51-kube-api-access-tk4tq\") pod \"image-registry-5bbbf98cd9-n68bb\" (UID: \"1231a7af-9e87-4c88-9b24-457ae238ae51\") " pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.402993 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:26.402969 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a5de9ed_1057_4943_9b06_aade8bc24270.slice/crio-4fdfb81a9f3cf6f79de56258563f38c564705ab6bee5b34e0a498eb08c865a96 WatchSource:0}: Error finding container 4fdfb81a9f3cf6f79de56258563f38c564705ab6bee5b34e0a498eb08c865a96: Status 404 returned error can't find the container with id 4fdfb81a9f3cf6f79de56258563f38c564705ab6bee5b34e0a498eb08c865a96 Apr 16 08:41:26.435638 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.435595 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:26.547522 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.547466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-lgk8t"] Apr 16 08:41:26.551448 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:26.551412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74643581_ba72_4b9d_ae5a_bc893d97b6a0.slice/crio-23e90d5648336679a0c3c8b2965f7b919fa02e9cc7d8064f62c50d94c01c4dce WatchSource:0}: Error finding container 23e90d5648336679a0c3c8b2965f7b919fa02e9cc7d8064f62c50d94c01c4dce: Status 404 returned error can't find the container with id 23e90d5648336679a0c3c8b2965f7b919fa02e9cc7d8064f62c50d94c01c4dce Apr 16 08:41:26.593406 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:26.593379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bbbf98cd9-n68bb"] Apr 16 08:41:26.596100 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:26.596069 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1231a7af_9e87_4c88_9b24_457ae238ae51.slice/crio-99c4b3787747fd5aa542326be3343828b6d90dd2686205d19d02fa0ec358e0aa WatchSource:0}: Error finding container 99c4b3787747fd5aa542326be3343828b6d90dd2686205d19d02fa0ec358e0aa: Status 404 returned error can't find the container with id 99c4b3787747fd5aa542326be3343828b6d90dd2686205d19d02fa0ec358e0aa Apr 16 08:41:27.173567 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.173533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" event={"ID":"1231a7af-9e87-4c88-9b24-457ae238ae51","Type":"ContainerStarted","Data":"83c3f1e06b7513c31c60497a50c9646400fb7b43f42cf73a22408cb7509dd3d4"} Apr 16 08:41:27.174031 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.173575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" event={"ID":"1231a7af-9e87-4c88-9b24-457ae238ae51","Type":"ContainerStarted","Data":"99c4b3787747fd5aa542326be3343828b6d90dd2686205d19d02fa0ec358e0aa"} Apr 16 08:41:27.174031 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.173611 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:27.174875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.174849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-lgk8t" event={"ID":"74643581-ba72-4b9d-ae5a-bc893d97b6a0","Type":"ContainerStarted","Data":"23e90d5648336679a0c3c8b2965f7b919fa02e9cc7d8064f62c50d94c01c4dce"} Apr 16 08:41:27.176609 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.176577 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f45f95974-smmsv" Apr 16 08:41:27.177037 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.176982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2rm2g" event={"ID":"5a5de9ed-1057-4943-9b06-aade8bc24270","Type":"ContainerStarted","Data":"4abe4485da02db6f1572517c49cf13fe1f2797c0d99c3a27f9f0a62cac2ea5d9"} Apr 16 08:41:27.177037 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.177009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2rm2g" event={"ID":"5a5de9ed-1057-4943-9b06-aade8bc24270","Type":"ContainerStarted","Data":"86f60219ec23e90c95575f664ced9b66594ca85f5235a6661a6fa30316d363ab"} Apr 16 08:41:27.177037 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.177022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2rm2g" event={"ID":"5a5de9ed-1057-4943-9b06-aade8bc24270","Type":"ContainerStarted","Data":"4fdfb81a9f3cf6f79de56258563f38c564705ab6bee5b34e0a498eb08c865a96"} Apr 16 08:41:27.192495 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.192444 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" podStartSLOduration=1.192426747 podStartE2EDuration="1.192426747s" podCreationTimestamp="2026-04-16 08:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:41:27.191862918 +0000 UTC m=+139.111848793" watchObservedRunningTime="2026-04-16 08:41:27.192426747 +0000 UTC m=+139.112412620" Apr 16 08:41:27.218128 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.218090 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f45f95974-smmsv"] Apr 16 08:41:27.223752 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.223688 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-f45f95974-smmsv"] Apr 16 08:41:27.245028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.244998 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:41:27.247092 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.247066 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.249517 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249489 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 08:41:27.249654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249503 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w94jt\"" Apr 16 08:41:27.249654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 08:41:27.249804 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 08:41:27.249804 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 08:41:27.249804 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.249798 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 08:41:27.260024 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.259991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:41:27.395009 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.394977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.395201 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.395014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2hhs\" (UniqueName: \"kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.395201 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.395050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.395201 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.395149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.395201 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.395187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: 
\"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.395393 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.395264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496587 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496813 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496813 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496813 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496813 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.496813 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.496790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2hhs\" (UniqueName: \"kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.497422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.497391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.497534 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.497398 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.498047 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.498023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.499990 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.499963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.500192 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.500169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.504763 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.504741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2hhs\" (UniqueName: \"kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs\") pod \"console-8bc9c8f99-fslt6\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.558882 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.558844 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:27.706707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:27.706667 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:41:27.711346 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:27.711288 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8cfd1ac_9069_4487_884b_0746af4a9d81.slice/crio-e64cd1ba3c45731c49a48f5f9e7e9b26d7c1b5bf600615bc26cc1f0f2810b87c WatchSource:0}: Error finding container e64cd1ba3c45731c49a48f5f9e7e9b26d7c1b5bf600615bc26cc1f0f2810b87c: Status 404 returned error can't find the container with id e64cd1ba3c45731c49a48f5f9e7e9b26d7c1b5bf600615bc26cc1f0f2810b87c Apr 16 08:41:28.181177 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:28.181134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bc9c8f99-fslt6" event={"ID":"c8cfd1ac-9069-4487-884b-0746af4a9d81","Type":"ContainerStarted","Data":"e64cd1ba3c45731c49a48f5f9e7e9b26d7c1b5bf600615bc26cc1f0f2810b87c"} Apr 16 08:41:28.666840 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:28.666809 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a84042-f667-4c3a-9654-28b88137d9f8" path="/var/lib/kubelet/pods/b2a84042-f667-4c3a-9654-28b88137d9f8/volumes" Apr 16 08:41:29.185923 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:29.185881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2rm2g" event={"ID":"5a5de9ed-1057-4943-9b06-aade8bc24270","Type":"ContainerStarted","Data":"d08e85579c7aa297dbb03a9bd6f59b5acbb6319a66529592b4ec26666565cf0a"} Apr 16 08:41:29.205129 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:29.205082 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2rm2g" podStartSLOduration=1.992150485 podStartE2EDuration="4.205061173s" podCreationTimestamp="2026-04-16 08:41:25 +0000 UTC" firstStartedPulling="2026-04-16 08:41:26.487036678 +0000 UTC m=+138.407022531" lastFinishedPulling="2026-04-16 08:41:28.699947362 +0000 UTC m=+140.619933219" observedRunningTime="2026-04-16 08:41:29.202830942 +0000 UTC m=+141.122816819" watchObservedRunningTime="2026-04-16 08:41:29.205061173 +0000 UTC m=+141.125047049" Apr 16 08:41:31.194176 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:31.194089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bc9c8f99-fslt6" event={"ID":"c8cfd1ac-9069-4487-884b-0746af4a9d81","Type":"ContainerStarted","Data":"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56"} Apr 16 08:41:31.211089 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:31.211028 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8bc9c8f99-fslt6" podStartSLOduration=1.066208802 podStartE2EDuration="4.211010307s" podCreationTimestamp="2026-04-16 08:41:27 +0000 UTC" firstStartedPulling="2026-04-16 08:41:27.714027267 +0000 UTC m=+139.634013135" lastFinishedPulling="2026-04-16 08:41:30.858828783 +0000 UTC m=+142.778814640" observedRunningTime="2026-04-16 08:41:31.209468649 +0000 UTC m=+143.129454526" watchObservedRunningTime="2026-04-16 08:41:31.211010307 +0000 UTC m=+143.130996183" Apr 16 08:41:37.559920 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:37.559882 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:37.559920 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:37.559933 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:41:37.561644 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:37.561617 2578 patch_prober.go:28] interesting pod/console-8bc9c8f99-fslt6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.18:8443/health\": dial tcp 10.133.0.18:8443: connect: connection refused" start-of-body= Apr 16 08:41:37.561794 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:37.561665 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-8bc9c8f99-fslt6" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" probeResult="failure" output="Get \"https://10.133.0.18:8443/health\": dial tcp 10.133.0.18:8443: connect: connection refused" Apr 16 08:41:42.193256 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.191613 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qsc79"] Apr 16 08:41:42.202578 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.202546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.205337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.205303 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 08:41:42.205457 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.205407 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 08:41:42.205524 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.205311 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lc2hb\"" Apr 16 08:41:42.207552 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.206789 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 08:41:42.207552 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.206995 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 08:41:42.207552 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.207184 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 08:41:42.208881 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.208759 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 08:41:42.224801 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.224762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-sys\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.224966 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.224874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dpgg\" (UniqueName: 
\"kubernetes.io/projected/baddd5f3-613c-43e0-92bd-81661f559e01-kube-api-access-5dpgg\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.224966 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.224941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225075 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.224972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-textfile\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225075 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.225020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-wtmp\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225075 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.225050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.225085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-root\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.225122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-metrics-client-ca\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.225223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.225151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-accelerators-collector-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-root\") pod \"node-exporter-qsc79\" (UID: 
\"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-metrics-client-ca\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-accelerators-collector-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-root\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-sys\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-sys\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dpgg\" (UniqueName: \"kubernetes.io/projected/baddd5f3-613c-43e0-92bd-81661f559e01-kube-api-access-5dpgg\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326984 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-textfile\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326984 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-wtmp\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326984 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.326638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.326984 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:42.326787 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 08:41:42.326984 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:42.326860 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls podName:baddd5f3-613c-43e0-92bd-81661f559e01 nodeName:}" failed. No retries permitted until 2026-04-16 08:41:42.826840306 +0000 UTC m=+154.746826159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls") pod "node-exporter-qsc79" (UID: "baddd5f3-613c-43e0-92bd-81661f559e01") : secret "node-exporter-tls" not found Apr 16 08:41:42.327294 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.327108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-metrics-client-ca\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.327741 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.327543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-accelerators-collector-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.327741 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.327587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-wtmp\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.327741 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.327666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-textfile\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.332564 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.332536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " 
pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.341936 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.341883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dpgg\" (UniqueName: \"kubernetes.io/projected/baddd5f3-613c-43e0-92bd-81661f559e01-kube-api-access-5dpgg\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.830912 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.830813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:42.833828 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:42.833797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/baddd5f3-613c-43e0-92bd-81661f559e01-node-exporter-tls\") pod \"node-exporter-qsc79\" (UID: \"baddd5f3-613c-43e0-92bd-81661f559e01\") " pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:43.117371 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.117338 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qsc79" Apr 16 08:41:43.232296 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.232258 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qsc79" event={"ID":"baddd5f3-613c-43e0-92bd-81661f559e01","Type":"ContainerStarted","Data":"804545e5cd24a9dcaa2ccb5b2f40d3d68079e378deb9dc8a7ce2d512b168c8fc"} Apr 16 08:41:43.233807 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.233774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-lgk8t" event={"ID":"74643581-ba72-4b9d-ae5a-bc893d97b6a0","Type":"ContainerStarted","Data":"da44c3312bf95ffcbfafbb309b81c6ff878e5a42e04eb669a10ea2872ff0b45a"} Apr 16 08:41:43.234033 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.234012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:43.242028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.242000 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 08:41:43.247310 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.247283 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.250191 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250168 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 08:41:43.250414 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250398 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 08:41:43.250518 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250507 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-lgk8t" Apr 16 08:41:43.250580 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250532 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 08:41:43.250639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ccbv6\"" Apr 16 08:41:43.250639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 08:41:43.250639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250632 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 08:41:43.250814 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.250620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 08:41:43.251026 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.251007 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 08:41:43.251124 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.251062 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 08:41:43.257444 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.257426 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 08:41:43.257556 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.257495 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-lgk8t" podStartSLOduration=1.326103471 podStartE2EDuration="17.257480475s" podCreationTimestamp="2026-04-16 08:41:26 +0000 UTC" firstStartedPulling="2026-04-16 08:41:26.553856283 +0000 UTC m=+138.473842136" lastFinishedPulling="2026-04-16 08:41:42.485233282 +0000 UTC m=+154.405219140" observedRunningTime="2026-04-16 08:41:43.256310281 +0000 UTC m=+155.176296190" watchObservedRunningTime="2026-04-16 08:41:43.257480475 +0000 UTC m=+155.177466351" Apr 16 08:41:43.260648 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.260619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 08:41:43.335444 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-config-out\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-web-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335963 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335963 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.335963 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.336052 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.335969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj5ld\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-kube-api-access-tj5ld\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.437063 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.436961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.437063 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-config-out\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.437283 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:43.437136 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 08:41:43.437283 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.437283 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:43.437222 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls 
podName:3c65bce5-5192-4f53-bc36-bbc33717edac nodeName:}" failed. No retries permitted until 2026-04-16 08:41:43.937200609 +0000 UTC m=+155.857186472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "3c65bce5-5192-4f53-bc36-bbc33717edac") : secret "alertmanager-main-tls" not found Apr 16 08:41:43.437283 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-web-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj5ld\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-kube-api-access-tj5ld\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.437596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.438017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.438279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.438442 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.438307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65bce5-5192-4f53-bc36-bbc33717edac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.440190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c65bce5-5192-4f53-bc36-bbc33717edac-config-out\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441035 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.440313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441207 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.441117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-web-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441683 ip-10-0-139-8 kubenswrapper[2578]: 
I0416 08:41:43.441510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441683 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.441574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.441683 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.441634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.442740 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.442480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.442838 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.442816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.446501 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.446477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj5ld\" (UniqueName: \"kubernetes.io/projected/3c65bce5-5192-4f53-bc36-bbc33717edac-kube-api-access-tj5ld\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.942277 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.942225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:43.945336 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:43.945298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c65bce5-5192-4f53-bc36-bbc33717edac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c65bce5-5192-4f53-bc36-bbc33717edac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:44.160827 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:44.160788 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:41:44.428317 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:44.428245 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fqcdx" podUID="8cea109b-1867-4bf4-a48a-15604584a8d2" Apr 16 08:41:44.444393 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:41:44.444347 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4tvsn" podUID="11093fee-55ea-464a-b838-08d5d6f8e907" Apr 16 08:41:44.495678 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:44.495637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 08:41:44.499232 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:44.499170 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c65bce5_5192_4f53_bc36_bbc33717edac.slice/crio-56dcc0452b87dbbbca6f064e41ce8f79f3ea89cefd0ecf81b1ff649c8d40c744 WatchSource:0}: Error finding container 56dcc0452b87dbbbca6f064e41ce8f79f3ea89cefd0ecf81b1ff649c8d40c744: Status 404 returned error can't find the container with id 56dcc0452b87dbbbca6f064e41ce8f79f3ea89cefd0ecf81b1ff649c8d40c744 Apr 16 08:41:45.243136 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:45.243103 2578 generic.go:358] "Generic (PLEG): container finished" podID="baddd5f3-613c-43e0-92bd-81661f559e01" containerID="63934f456a77773091b362a346318ba5ffdb8fa36abed3a702ff0108d847b1cd" exitCode=0 Apr 16 08:41:45.243311 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:45.243220 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qsc79" event={"ID":"baddd5f3-613c-43e0-92bd-81661f559e01","Type":"ContainerDied","Data":"63934f456a77773091b362a346318ba5ffdb8fa36abed3a702ff0108d847b1cd"} Apr 16 08:41:45.245014 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:45.244993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:41:45.245241 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:45.245224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"56dcc0452b87dbbbca6f064e41ce8f79f3ea89cefd0ecf81b1ff649c8d40c744"} Apr 16 08:41:45.245341 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:45.245311 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fqcdx" Apr 16 08:41:46.250476 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.250403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qsc79" event={"ID":"baddd5f3-613c-43e0-92bd-81661f559e01","Type":"ContainerStarted","Data":"1bd7dd4ec8e28aaeb7872cd7214ea8059b7f9a7b24186c0df58b4ba49aee3dac"} Apr 16 08:41:46.250476 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.250448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qsc79" event={"ID":"baddd5f3-613c-43e0-92bd-81661f559e01","Type":"ContainerStarted","Data":"8b6751b529c666fd7955a65dde6dff34763e54910af594bef189a67341e4f36f"} Apr 16 08:41:46.269441 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.269387 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qsc79" podStartSLOduration=2.928987536 podStartE2EDuration="4.26936905s" podCreationTimestamp="2026-04-16 08:41:42 +0000 UTC" firstStartedPulling="2026-04-16 08:41:43.131683196 +0000 UTC m=+155.051669061" lastFinishedPulling="2026-04-16 08:41:44.472064716 +0000 UTC m=+156.392050575" observedRunningTime="2026-04-16 08:41:46.267050184 +0000 UTC m=+158.187036053" watchObservedRunningTime="2026-04-16 08:41:46.26936905 +0000 UTC m=+158.189354925" Apr 16 08:41:46.440660 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.440564 2578 patch_prober.go:28] interesting pod/image-registry-5bbbf98cd9-n68bb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 08:41:46.440660 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.440636 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" podUID="1231a7af-9e87-4c88-9b24-457ae238ae51" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 08:41:46.565606 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.565566 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57bbcc56b8-r4vck"] Apr 16 08:41:46.587385 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.587353 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57bbcc56b8-r4vck"] Apr 16 08:41:46.587544 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.587511 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.589959 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.589935 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 08:41:46.590220 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.590200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 08:41:46.590328 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.590243 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-12k7gnovutnr\"" Apr 16 08:41:46.590398 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.590346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-ms4r5\"" Apr 16 08:41:46.590460 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.590442 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 08:41:46.590514 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.590477 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 08:41:46.671057 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-tls\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l2v8\" (UniqueName: \"kubernetes.io/projected/a8884475-b8ab-446b-bf80-e0b74c7da6f6-kube-api-access-2l2v8\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-client-certs\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a8884475-b8ab-446b-bf80-e0b74c7da6f6-audit-log\") pod 
\"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671410 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-metrics-server-audit-profiles\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.671410 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.671301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-client-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.771953 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.771866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l2v8\" (UniqueName: \"kubernetes.io/projected/a8884475-b8ab-446b-bf80-e0b74c7da6f6-kube-api-access-2l2v8\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.771953 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.771910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-client-certs\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772160 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a8884475-b8ab-446b-bf80-e0b74c7da6f6-audit-log\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772160 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-metrics-server-audit-profiles\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772280 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-client-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772280 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772366 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-tls\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.772613 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.772588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a8884475-b8ab-446b-bf80-e0b74c7da6f6-audit-log\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.773359 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.773330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.773464 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.773371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a8884475-b8ab-446b-bf80-e0b74c7da6f6-metrics-server-audit-profiles\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.775489 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.774956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-client-certs\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.775489 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.775164 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-client-ca-bundle\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.775489 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.775200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a8884475-b8ab-446b-bf80-e0b74c7da6f6-secret-metrics-server-tls\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.780526 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.780500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l2v8\" (UniqueName: 
\"kubernetes.io/projected/a8884475-b8ab-446b-bf80-e0b74c7da6f6-kube-api-access-2l2v8\") pod \"metrics-server-57bbcc56b8-r4vck\" (UID: \"a8884475-b8ab-446b-bf80-e0b74c7da6f6\") " pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.899345 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.899308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:41:46.950374 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.950320 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq"] Apr 16 08:41:46.988704 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.988669 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq"] Apr 16 08:41:46.988924 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.988841 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:46.991221 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.991184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 08:41:46.991466 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:46.991434 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-r6z9r\"" Apr 16 08:41:47.056656 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.056560 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57bbcc56b8-r4vck"] Apr 16 08:41:47.060390 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:47.060350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8884475_b8ab_446b_bf80_e0b74c7da6f6.slice/crio-1999f345de75f0ae0658c291c5d2ee4d4af2a0fbc57463e4516255deba462435 WatchSource:0}: Error finding container 1999f345de75f0ae0658c291c5d2ee4d4af2a0fbc57463e4516255deba462435: Status 404 returned error can't find the container with id 1999f345de75f0ae0658c291c5d2ee4d4af2a0fbc57463e4516255deba462435 Apr 16 08:41:47.076026 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.075983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/441d3e87-1392-4ee0-97c2-285af9c5f52d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mj4jq\" (UID: \"441d3e87-1392-4ee0-97c2-285af9c5f52d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:47.177629 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.177588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/441d3e87-1392-4ee0-97c2-285af9c5f52d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mj4jq\" (UID: \"441d3e87-1392-4ee0-97c2-285af9c5f52d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:47.180515 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.180483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/441d3e87-1392-4ee0-97c2-285af9c5f52d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mj4jq\" (UID: \"441d3e87-1392-4ee0-97c2-285af9c5f52d\") " 
pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:47.256504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.256460 2578 generic.go:358] "Generic (PLEG): container finished" podID="3c65bce5-5192-4f53-bc36-bbc33717edac" containerID="9f391dce9efe43594770c67054637dcc56c107eb35ab4975b22a093726b4c42b" exitCode=0 Apr 16 08:41:47.256965 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.256589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerDied","Data":"9f391dce9efe43594770c67054637dcc56c107eb35ab4975b22a093726b4c42b"} Apr 16 08:41:47.257857 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.257831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" event={"ID":"a8884475-b8ab-446b-bf80-e0b74c7da6f6","Type":"ContainerStarted","Data":"1999f345de75f0ae0658c291c5d2ee4d4af2a0fbc57463e4516255deba462435"} Apr 16 08:41:47.300939 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.300904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:47.443970 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.443920 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq"] Apr 16 08:41:47.447082 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:47.447045 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441d3e87_1392_4ee0_97c2_285af9c5f52d.slice/crio-da1aac2629ae6a76f33d108c7e98bb5a0e7679551c453bc5844922db72b40988 WatchSource:0}: Error finding container da1aac2629ae6a76f33d108c7e98bb5a0e7679551c453bc5844922db72b40988: Status 404 returned error can't find the container with id da1aac2629ae6a76f33d108c7e98bb5a0e7679551c453bc5844922db72b40988 Apr 16 08:41:47.559342 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.559306 2578 patch_prober.go:28] interesting pod/console-8bc9c8f99-fslt6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.18:8443/health\": dial tcp 10.133.0.18:8443: connect: connection refused" start-of-body= Apr 16 08:41:47.559526 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:47.559368 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-8bc9c8f99-fslt6" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" probeResult="failure" output="Get \"https://10.133.0.18:8443/health\": dial tcp 10.133.0.18:8443: connect: connection refused" Apr 16 08:41:48.191347 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.191307 2578 patch_prober.go:28] interesting pod/image-registry-5bbbf98cd9-n68bb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 08:41:48.191347 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.191381 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" podUID="1231a7af-9e87-4c88-9b24-457ae238ae51" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 08:41:48.239676 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.239635 
2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:41:48.263589 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.263551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" event={"ID":"441d3e87-1392-4ee0-97c2-285af9c5f52d","Type":"ContainerStarted","Data":"da1aac2629ae6a76f33d108c7e98bb5a0e7679551c453bc5844922db72b40988"} Apr 16 08:41:48.354468 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.354437 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 08:41:48.377976 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.377900 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 08:41:48.378599 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.378378 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.382319 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.382054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 08:41:48.383846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.382484 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 08:41:48.383846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.382901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 08:41:48.383846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.383107 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 08:41:48.383846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.383327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 08:41:48.383846 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.383520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 08:41:48.384203 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.384021 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.384378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.384647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.385092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sh72c\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.385310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.385367 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 08:41:48.385505 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.385309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-37jguvvc61arb\"" Apr 16 08:41:48.388190 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.388170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 08:41:48.491526 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491526 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-config-out\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491526 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-web-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:41:48.491820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.491983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.491896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.492289 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.492246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.492474 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.492297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.492847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.492826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.493732 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.493642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6q6n\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-kube-api-access-f6q6n\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.493875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.493854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.493970 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.493951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.494038 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.494000 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.494095 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.494033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.494095 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.494059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.594847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.594847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-config-out\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-web-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.594978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6q6n\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-kube-api-access-f6q6n\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.595419 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.595330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.604227 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.596496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.604227 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.597797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.604227 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.599552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.604227 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.600943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.604560 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.604502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-web-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
08:41:48.606843 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.606430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.607757 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.607705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.609062 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.609027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/193ac91b-a7b4-46b8-bd09-983570fff5c6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.612679 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.611886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.612679 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.612499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.613392 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.613373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.615070 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.614981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6q6n\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-kube-api-access-f6q6n\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.615315 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.615229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/193ac91b-a7b4-46b8-bd09-983570fff5c6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.615414 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.615370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/193ac91b-a7b4-46b8-bd09-983570fff5c6-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.616757 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.616664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.617622 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.617518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-config\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.618344 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.618289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.620655 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.620635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/193ac91b-a7b4-46b8-bd09-983570fff5c6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"193ac91b-a7b4-46b8-bd09-983570fff5c6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:48.697071 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:48.696583 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:41:49.303991 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.303949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:41:49.304685 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.304033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:41:49.308182 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.308153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cea109b-1867-4bf4-a48a-15604584a8d2-metrics-tls\") pod \"dns-default-fqcdx\" (UID: \"8cea109b-1867-4bf4-a48a-15604584a8d2\") " pod="openshift-dns/dns-default-fqcdx" Apr 16 08:41:49.311581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.311554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11093fee-55ea-464a-b838-08d5d6f8e907-cert\") pod \"ingress-canary-4tvsn\" (UID: \"11093fee-55ea-464a-b838-08d5d6f8e907\") " pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:41:49.448322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.448284 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q9sj7\"" Apr 16 08:41:49.449246 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.449220 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d2hv\"" Apr 16 08:41:49.456218 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.456185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fqcdx" Apr 16 08:41:49.456218 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:49.456210 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4tvsn" Apr 16 08:41:50.888924 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:50.887509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fqcdx"] Apr 16 08:41:50.893053 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:50.893016 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cea109b_1867_4bf4_a48a_15604584a8d2.slice/crio-bc6d564b3c88517777b33d280d5c32b88a5d6a791b426412183e481499e561ed WatchSource:0}: Error finding container bc6d564b3c88517777b33d280d5c32b88a5d6a791b426412183e481499e561ed: Status 404 returned error can't find the container with id bc6d564b3c88517777b33d280d5c32b88a5d6a791b426412183e481499e561ed Apr 16 08:41:50.909829 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:50.909786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvsn"] Apr 16 08:41:50.914092 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:50.914044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11093fee_55ea_464a_b838_08d5d6f8e907.slice/crio-d74df01e6cdd95f89d3a06a692992a9268d472402896bc407ed971b2e68d0b93 WatchSource:0}: Error finding container d74df01e6cdd95f89d3a06a692992a9268d472402896bc407ed971b2e68d0b93: Status 404 returned error can't find the container with id d74df01e6cdd95f89d3a06a692992a9268d472402896bc407ed971b2e68d0b93 Apr 16 08:41:50.931126 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:50.930836 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 08:41:50.933464 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:41:50.933431 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod193ac91b_a7b4_46b8_bd09_983570fff5c6.slice/crio-4eaf8576ebe387a56be64e19c39491e04ea2b8cf784d881863cc05c1896a1376 WatchSource:0}: Error finding container 4eaf8576ebe387a56be64e19c39491e04ea2b8cf784d881863cc05c1896a1376: Status 404 returned error can't find the container with id 4eaf8576ebe387a56be64e19c39491e04ea2b8cf784d881863cc05c1896a1376 Apr 16 08:41:51.279208 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.279179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"946ed7c8fc3314d02defed2eef492c82dd8b123879109baade49e9e73151ab4d"} Apr 16 08:41:51.279315 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.279220 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"5805737256b8d4254f0977c1d268d17251bd5f6a0cf1d7805dd344ffa41dd37a"} Apr 16 08:41:51.279315 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.279233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"d7b57db3ffff1dd7d837d3111b8aee557b4f727e5797d2d55b27dcaa84d23cfd"} Apr 16 08:41:51.281577 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.281347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" 
event={"ID":"441d3e87-1392-4ee0-97c2-285af9c5f52d","Type":"ContainerStarted","Data":"55ae88fa53b2e480b3b3ace7dedb8a062885c35b7c1a99fbbd61460b282c5201"} Apr 16 08:41:51.281577 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.281564 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:51.283365 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.283327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fqcdx" event={"ID":"8cea109b-1867-4bf4-a48a-15604584a8d2","Type":"ContainerStarted","Data":"bc6d564b3c88517777b33d280d5c32b88a5d6a791b426412183e481499e561ed"} Apr 16 08:41:51.285732 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.284933 2578 generic.go:358] "Generic (PLEG): container finished" podID="193ac91b-a7b4-46b8-bd09-983570fff5c6" containerID="1f1fde6705aea06d8eed5bdd2d9767cf537d339be08023a543d2bae50673244b" exitCode=0 Apr 16 08:41:51.285732 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.285070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerDied","Data":"1f1fde6705aea06d8eed5bdd2d9767cf537d339be08023a543d2bae50673244b"} Apr 16 08:41:51.285732 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.285099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"4eaf8576ebe387a56be64e19c39491e04ea2b8cf784d881863cc05c1896a1376"} Apr 16 08:41:51.286879 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.286853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4tvsn" event={"ID":"11093fee-55ea-464a-b838-08d5d6f8e907","Type":"ContainerStarted","Data":"d74df01e6cdd95f89d3a06a692992a9268d472402896bc407ed971b2e68d0b93"} Apr 16 08:41:51.287197 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.287182 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" Apr 16 08:41:51.289007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.288984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" event={"ID":"a8884475-b8ab-446b-bf80-e0b74c7da6f6","Type":"ContainerStarted","Data":"cdebdbbfbd787bef3fb644ac0b2680e1312b881c065c00694bfb387b08b439bc"} Apr 16 08:41:51.297089 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.295847 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mj4jq" podStartSLOduration=2.054887891 podStartE2EDuration="5.295831945s" podCreationTimestamp="2026-04-16 08:41:46 +0000 UTC" firstStartedPulling="2026-04-16 08:41:47.449351615 +0000 UTC m=+159.369337469" lastFinishedPulling="2026-04-16 08:41:50.690295657 +0000 UTC m=+162.610281523" observedRunningTime="2026-04-16 08:41:51.295038598 +0000 UTC m=+163.215024476" watchObservedRunningTime="2026-04-16 08:41:51.295831945 +0000 UTC m=+163.215817819" Apr 16 08:41:51.310663 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:51.310607 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" podStartSLOduration=1.6871883680000002 podStartE2EDuration="5.310590812s" podCreationTimestamp="2026-04-16 08:41:46 +0000 UTC" 
firstStartedPulling="2026-04-16 08:41:47.062681476 +0000 UTC m=+158.982667328" lastFinishedPulling="2026-04-16 08:41:50.686083908 +0000 UTC m=+162.606069772" observedRunningTime="2026-04-16 08:41:51.309915144 +0000 UTC m=+163.229901041" watchObservedRunningTime="2026-04-16 08:41:51.310590812 +0000 UTC m=+163.230576702" Apr 16 08:41:52.298449 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:52.298339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"2b27b9cd4cd9cc1f6f66495ab4f5014eea8d6a8b8c64ae9991fd26c3f1b5e323"} Apr 16 08:41:52.298449 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:52.298411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"69f85cdc53a5e7b4897ffa6e585ab4c99631ebd5a82bcfe3110265a052493a1d"} Apr 16 08:41:56.314032 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.313993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4tvsn" event={"ID":"11093fee-55ea-464a-b838-08d5d6f8e907","Type":"ContainerStarted","Data":"679bfd10152b9679042ca49272ec6dd2955a74d8bc7c88d53555a8f602ac7f04"} Apr 16 08:41:56.317777 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.317746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c65bce5-5192-4f53-bc36-bbc33717edac","Type":"ContainerStarted","Data":"a2913f25bf49901778a3f885f55ef0f33a9f94fb9a1b05456a5492cdc01b1865"} Apr 16 08:41:56.319663 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.319627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fqcdx" event={"ID":"8cea109b-1867-4bf4-a48a-15604584a8d2","Type":"ContainerStarted","Data":"d7b3e5a4b26936b853c4037c2c3944f632448608fc32aaa0849c0bbdd4b1dfb1"} Apr 16 08:41:56.319663 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.319663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fqcdx" event={"ID":"8cea109b-1867-4bf4-a48a-15604584a8d2","Type":"ContainerStarted","Data":"d7f84bdb28c833875e8e9d9ab309bab62669cc43ae475b67de745053cdb6e17a"} Apr 16 08:41:56.319899 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.319756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fqcdx" Apr 16 08:41:56.321751 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.321707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"6086632b0e8cbb1805be6e188bf5249f52477e347047e3dc262778b04294c687"} Apr 16 08:41:56.321872 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.321756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"5600bd9b0322fb7cc908b4a313cbabd4e94ce5aa43cc4408e31f3efceb345d1c"} Apr 16 08:41:56.333297 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.333219 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4tvsn" podStartSLOduration=130.978000943 podStartE2EDuration="2m15.333199187s" podCreationTimestamp="2026-04-16 08:39:41 +0000 UTC" firstStartedPulling="2026-04-16 08:41:50.917485363 
+0000 UTC m=+162.837471216" lastFinishedPulling="2026-04-16 08:41:55.272683608 +0000 UTC m=+167.192669460" observedRunningTime="2026-04-16 08:41:56.331685884 +0000 UTC m=+168.251671761" watchObservedRunningTime="2026-04-16 08:41:56.333199187 +0000 UTC m=+168.253185062" Apr 16 08:41:56.352836 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.352774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fqcdx" podStartSLOduration=130.980846862 podStartE2EDuration="2m15.352754335s" podCreationTimestamp="2026-04-16 08:39:41 +0000 UTC" firstStartedPulling="2026-04-16 08:41:50.895954964 +0000 UTC m=+162.815940817" lastFinishedPulling="2026-04-16 08:41:55.267862437 +0000 UTC m=+167.187848290" observedRunningTime="2026-04-16 08:41:56.352074991 +0000 UTC m=+168.272060886" watchObservedRunningTime="2026-04-16 08:41:56.352754335 +0000 UTC m=+168.272740212" Apr 16 08:41:56.385745 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.385665 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.621989487 podStartE2EDuration="13.385649636s" podCreationTimestamp="2026-04-16 08:41:43 +0000 UTC" firstStartedPulling="2026-04-16 08:41:44.503014329 +0000 UTC m=+156.423000185" lastFinishedPulling="2026-04-16 08:41:55.266674474 +0000 UTC m=+167.186660334" observedRunningTime="2026-04-16 08:41:56.383830586 +0000 UTC m=+168.303816462" watchObservedRunningTime="2026-04-16 08:41:56.385649636 +0000 UTC m=+168.305635510" Apr 16 08:41:56.440109 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.440067 2578 patch_prober.go:28] interesting pod/image-registry-5bbbf98cd9-n68bb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 08:41:56.440343 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:56.440147 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" podUID="1231a7af-9e87-4c88-9b24-457ae238ae51" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 08:41:58.185640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:58.185580 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5bbbf98cd9-n68bb" Apr 16 08:41:58.331205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:58.331176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"2a258c1a8124b92210899d7ce899f640cdfe1cad0e1a76f2b3c8eff88c4fea90"} Apr 16 08:41:58.331205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:58.331209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"7066c309776d815a7b2b849e22bb9ab4fa804d125ce81ecb5c52337c6728db33"} Apr 16 08:41:58.331392 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:58.331218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"e7865578b6cb831d51c013694662a149ffe98ce7af8bdec884068a62aa083646"} Apr 16 08:41:59.336416 
ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:59.336382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"193ac91b-a7b4-46b8-bd09-983570fff5c6","Type":"ContainerStarted","Data":"7598ef9d0115a8f1c7738bbb6166ae44a50bdecde771e561f1b757381702e8f5"} Apr 16 08:41:59.364423 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:41:59.364360 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.715027485 podStartE2EDuration="11.364342678s" podCreationTimestamp="2026-04-16 08:41:48 +0000 UTC" firstStartedPulling="2026-04-16 08:41:51.287050188 +0000 UTC m=+163.207036041" lastFinishedPulling="2026-04-16 08:41:57.936365369 +0000 UTC m=+169.856351234" observedRunningTime="2026-04-16 08:41:59.361849472 +0000 UTC m=+171.281835348" watchObservedRunningTime="2026-04-16 08:41:59.364342678 +0000 UTC m=+171.284328555" Apr 16 08:42:03.698088 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:03.698043 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:42:06.328566 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:06.328535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fqcdx" Apr 16 08:42:06.899904 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:06.899862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:42:06.899904 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:06.899907 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:42:13.268245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.268191 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8bc9c8f99-fslt6" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" containerID="cri-o://a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56" gracePeriod=15 Apr 16 08:42:13.503546 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.503522 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8bc9c8f99-fslt6_c8cfd1ac-9069-4487-884b-0746af4a9d81/console/0.log" Apr 16 08:42:13.503661 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.503582 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:42:13.643674 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643622 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.643901 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643689 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.643901 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643780 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.643901 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643818 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.644059 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643899 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2hhs\" (UniqueName: \"kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.644059 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.643980 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca\") pod \"c8cfd1ac-9069-4487-884b-0746af4a9d81\" (UID: \"c8cfd1ac-9069-4487-884b-0746af4a9d81\") " Apr 16 08:42:13.644176 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.644149 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:42:13.644322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.644280 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-oauth-serving-cert\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:13.644322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.644269 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config" (OuterVolumeSpecName: "console-config") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:42:13.644474 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.644378 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca" (OuterVolumeSpecName: "service-ca") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:42:13.646202 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.646171 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs" (OuterVolumeSpecName: "kube-api-access-z2hhs") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "kube-api-access-z2hhs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:42:13.646292 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.646213 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:42:13.646292 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.646233 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c8cfd1ac-9069-4487-884b-0746af4a9d81" (UID: "c8cfd1ac-9069-4487-884b-0746af4a9d81"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:42:13.745404 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.745356 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-service-ca\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:13.745404 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.745400 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-serving-cert\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:13.745404 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.745411 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-config\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:13.745404 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.745421 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8cfd1ac-9069-4487-884b-0746af4a9d81-console-oauth-config\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:13.745667 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:13.745430 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2hhs\" (UniqueName: \"kubernetes.io/projected/c8cfd1ac-9069-4487-884b-0746af4a9d81-kube-api-access-z2hhs\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:42:14.382994 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.382963 2578 generic.go:358] "Generic (PLEG): container finished" podID="97f24f3e-056d-4441-bbc0-42973fb6dcc4" containerID="21d522cbbb86f3561ebe2bd3a9225e0778065e9292a15b4fdda05458edeac39c" exitCode=0 Apr 16 08:42:14.383388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.383041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" event={"ID":"97f24f3e-056d-4441-bbc0-42973fb6dcc4","Type":"ContainerDied","Data":"21d522cbbb86f3561ebe2bd3a9225e0778065e9292a15b4fdda05458edeac39c"} Apr 16 08:42:14.383473 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.383442 2578 scope.go:117] "RemoveContainer" containerID="21d522cbbb86f3561ebe2bd3a9225e0778065e9292a15b4fdda05458edeac39c" Apr 16 08:42:14.384330 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384316 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8bc9c8f99-fslt6_c8cfd1ac-9069-4487-884b-0746af4a9d81/console/0.log" Apr 16 08:42:14.384422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384348 2578 generic.go:358] "Generic (PLEG): container finished" podID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerID="a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56" exitCode=2 Apr 16 08:42:14.384422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bc9c8f99-fslt6" event={"ID":"c8cfd1ac-9069-4487-884b-0746af4a9d81","Type":"ContainerDied","Data":"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56"} Apr 16 08:42:14.384422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384416 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8bc9c8f99-fslt6" Apr 16 08:42:14.384573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384430 2578 scope.go:117] "RemoveContainer" containerID="a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56" Apr 16 08:42:14.384573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.384420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bc9c8f99-fslt6" event={"ID":"c8cfd1ac-9069-4487-884b-0746af4a9d81","Type":"ContainerDied","Data":"e64cd1ba3c45731c49a48f5f9e7e9b26d7c1b5bf600615bc26cc1f0f2810b87c"} Apr 16 08:42:14.455003 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.454978 2578 scope.go:117] "RemoveContainer" containerID="a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56" Apr 16 08:42:14.455356 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:42:14.455331 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56\": container with ID starting with a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56 not found: ID does not exist" containerID="a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56" Apr 16 08:42:14.455424 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.455366 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56"} err="failed to get container status \"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56\": rpc error: code = NotFound desc = could not find container \"a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56\": container with ID starting with a4f893bc8dd00b30d88b9c46af68e692ebd434620cb676583f49013d05e48d56 not found: ID does not exist" Apr 16 08:42:14.480504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.480475 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:42:14.484119 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.484092 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8bc9c8f99-fslt6"] Apr 16 08:42:14.666594 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:14.666517 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" path="/var/lib/kubelet/pods/c8cfd1ac-9069-4487-884b-0746af4a9d81/volumes" Apr 16 08:42:15.062747 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:15.062657 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4tvsn_11093fee-55ea-464a-b838-08d5d6f8e907/serve-healthcheck-canary/0.log" Apr 16 08:42:15.388581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:15.388551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-nz5z7" event={"ID":"97f24f3e-056d-4441-bbc0-42973fb6dcc4","Type":"ContainerStarted","Data":"58c036b75520df742af3a90e592b17a17d532afa63b9c24bfa306c5432deb3ae"} Apr 16 08:42:15.390584 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:15.390562 2578 generic.go:358] "Generic (PLEG): container finished" podID="c49165de-d15e-468a-9b37-71d0defef4a1" containerID="fdf1b3bb3fb9f3aa8c38830e658ae077f95f394914a5375f027bf11b1ad1dafe" exitCode=0 Apr 16 08:42:15.390704 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:15.390629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" event={"ID":"c49165de-d15e-468a-9b37-71d0defef4a1","Type":"ContainerDied","Data":"fdf1b3bb3fb9f3aa8c38830e658ae077f95f394914a5375f027bf11b1ad1dafe"} Apr 16 08:42:15.390924 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:15.390911 2578 scope.go:117] "RemoveContainer" containerID="fdf1b3bb3fb9f3aa8c38830e658ae077f95f394914a5375f027bf11b1ad1dafe" Apr 16 08:42:16.395185 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:16.395154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m7lwq" event={"ID":"c49165de-d15e-468a-9b37-71d0defef4a1","Type":"ContainerStarted","Data":"53aff2d1121a77e800c937e6550fb2e70b2fd8eeb781faf8551c6899f78f97ec"} Apr 16 08:42:26.905578 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:26.905546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:42:26.915577 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:26.915482 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57bbcc56b8-r4vck" Apr 16 08:42:48.697385 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:48.697346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:42:48.717908 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:48.717879 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:42:49.517670 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:42:49.517645 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:44:08.579072 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:44:08.579044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:44:08.579620 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:44:08.579342 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:44:08.584845 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:44:08.584822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:44:08.585198 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:44:08.585181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:44:08.589842 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:44:08.589827 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 08:45:19.108516 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.108427 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7"] Apr 16 08:45:19.109016 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.108814 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" Apr 16 08:45:19.109016 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:45:19.108828 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" Apr 16 08:45:19.109016 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.108902 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8cfd1ac-9069-4487-884b-0746af4a9d81" containerName="console" Apr 16 08:45:19.110914 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.110889 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.114538 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.114516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 08:45:19.114665 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.114554 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 08:45:19.114746 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.114695 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 08:45:19.114820 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.114799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-k8z2v\"" Apr 16 08:45:19.114880 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.114863 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 08:45:19.130728 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.130688 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7"] Apr 16 08:45:19.238088 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.238043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.238088 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.238081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-webhook-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.238301 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.238121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27v7s\" (UniqueName: \"kubernetes.io/projected/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-kube-api-access-27v7s\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.338986 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.338944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27v7s\" (UniqueName: 
\"kubernetes.io/projected/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-kube-api-access-27v7s\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.339167 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.339019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.339167 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.339037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-webhook-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.341700 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.341674 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.341831 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.341694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-webhook-cert\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.349532 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.349504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27v7s\" (UniqueName: \"kubernetes.io/projected/b488352f-eb0b-4eec-b9ba-5e9c536e67ea-kube-api-access-27v7s\") pod \"opendatahub-operator-controller-manager-569944d57d-zqvw7\" (UID: \"b488352f-eb0b-4eec-b9ba-5e9c536e67ea\") " pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.421611 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.421527 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:19.565349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.565316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7"] Apr 16 08:45:19.568295 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:45:19.568266 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb488352f_eb0b_4eec_b9ba_5e9c536e67ea.slice/crio-b8cfb2b92122467651a73c86e06a4d81229d67b7916b520e76601371314f7c6e WatchSource:0}: Error finding container b8cfb2b92122467651a73c86e06a4d81229d67b7916b520e76601371314f7c6e: Status 404 returned error can't find the container with id b8cfb2b92122467651a73c86e06a4d81229d67b7916b520e76601371314f7c6e Apr 16 08:45:19.569910 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.569894 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:45:19.929938 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:19.929901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" event={"ID":"b488352f-eb0b-4eec-b9ba-5e9c536e67ea","Type":"ContainerStarted","Data":"b8cfb2b92122467651a73c86e06a4d81229d67b7916b520e76601371314f7c6e"} Apr 16 08:45:22.941755 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:22.941691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" event={"ID":"b488352f-eb0b-4eec-b9ba-5e9c536e67ea","Type":"ContainerStarted","Data":"54e7dbe71f83842ffb65395d166be2c8820229f80e20b12404a0865bfa9b4b7e"} Apr 16 08:45:22.942155 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:22.941849 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:22.962142 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:22.962085 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" podStartSLOduration=1.5902954249999999 podStartE2EDuration="3.962067253s" podCreationTimestamp="2026-04-16 08:45:19 +0000 UTC" firstStartedPulling="2026-04-16 08:45:19.570017279 +0000 UTC m=+371.490003132" lastFinishedPulling="2026-04-16 08:45:21.941789103 +0000 UTC m=+373.861774960" observedRunningTime="2026-04-16 08:45:22.959825314 +0000 UTC m=+374.879811186" watchObservedRunningTime="2026-04-16 08:45:22.962067253 +0000 UTC m=+374.882053130" Apr 16 08:45:33.947285 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:33.947252 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-569944d57d-zqvw7" Apr 16 08:45:39.530858 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.530828 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j"] Apr 16 08:45:39.534031 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.534015 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.537181 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.537154 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 08:45:39.537326 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.537154 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 08:45:39.538181 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.538152 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 08:45:39.538315 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.538238 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:45:39.538387 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.538358 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pkkzs\"" Apr 16 08:45:39.538448 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.538416 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 08:45:39.545152 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.545130 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j"] Apr 16 08:45:39.592090 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.592060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-metrics-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.592256 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.592102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.592256 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.592158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7xc\" (UniqueName: \"kubernetes.io/projected/9e852241-561c-4ec2-b6bb-4f4811673c98-kube-api-access-kp7xc\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.592256 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.592229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9e852241-561c-4ec2-b6bb-4f4811673c98-manager-config\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.693258 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.693213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9e852241-561c-4ec2-b6bb-4f4811673c98-manager-config\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.693453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.693289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-metrics-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.693453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.693329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.693453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.693355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7xc\" (UniqueName: \"kubernetes.io/projected/9e852241-561c-4ec2-b6bb-4f4811673c98-kube-api-access-kp7xc\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.693898 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.693873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9e852241-561c-4ec2-b6bb-4f4811673c98-manager-config\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.695847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.695816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.696009 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.695989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e852241-561c-4ec2-b6bb-4f4811673c98-metrics-cert\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.703028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.703000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7xc\" (UniqueName: \"kubernetes.io/projected/9e852241-561c-4ec2-b6bb-4f4811673c98-kube-api-access-kp7xc\") pod \"lws-controller-manager-7cbc7f8cc-pgx6j\" (UID: \"9e852241-561c-4ec2-b6bb-4f4811673c98\") " pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.843490 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.843464 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:39.970954 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:39.970928 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j"] Apr 16 08:45:39.973690 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:45:39.973660 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e852241_561c_4ec2_b6bb_4f4811673c98.slice/crio-df4f0d6c3e0391230998c1cacb280da9263c3cae4aef2deae644cf575b4f2f51 WatchSource:0}: Error finding container df4f0d6c3e0391230998c1cacb280da9263c3cae4aef2deae644cf575b4f2f51: Status 404 returned error can't find the container with id df4f0d6c3e0391230998c1cacb280da9263c3cae4aef2deae644cf575b4f2f51 Apr 16 08:45:40.001410 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:40.001380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" event={"ID":"9e852241-561c-4ec2-b6bb-4f4811673c98","Type":"ContainerStarted","Data":"df4f0d6c3e0391230998c1cacb280da9263c3cae4aef2deae644cf575b4f2f51"} Apr 16 08:45:43.014313 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:43.014274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" event={"ID":"9e852241-561c-4ec2-b6bb-4f4811673c98","Type":"ContainerStarted","Data":"0cc5cda4ad94d61f8b6b9a31897364328aeea243a13e3939fa4a8d6ac09d060d"} Apr 16 08:45:43.014680 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:43.014501 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:45:43.029280 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:43.029227 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" podStartSLOduration=1.66067294 podStartE2EDuration="4.029214261s" podCreationTimestamp="2026-04-16 08:45:39 +0000 UTC" firstStartedPulling="2026-04-16 08:45:39.975533097 +0000 UTC m=+391.895519148" lastFinishedPulling="2026-04-16 08:45:42.344074609 +0000 UTC m=+394.264060469" observedRunningTime="2026-04-16 08:45:43.028685853 +0000 UTC m=+394.948671729" watchObservedRunningTime="2026-04-16 08:45:43.029214261 +0000 UTC m=+394.949200136" Apr 16 08:45:54.020504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:45:54.020464 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7cbc7f8cc-pgx6j" Apr 16 08:46:22.766957 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.766860 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:22.771747 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.771702 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.774418 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.774395 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 08:46:22.774778 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.774754 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 08:46:22.774903 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.774773 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 08:46:22.774976 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.774845 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-2vc65\"" Apr 16 08:46:22.779107 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.779079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpnh\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.779300 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.779278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.779406 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.779392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.779518 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.779504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.779644 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.779629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.781742 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:46:22.779796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.781831 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.781796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.781883 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.781831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.781883 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.781868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.781975 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.781329 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:22.883204 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpnh\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: 
\"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883684 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883684 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883831 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883831 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:46:22.883779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.883934 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.883835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.884246 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.884213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.884426 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.884408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.885917 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.885893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.886223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.886201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.891331 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.891311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:22.891447 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:22.891411 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpnh\" (UniqueName: 
\"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:23.032726 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.032610 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt"] Apr 16 08:46:23.035388 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.035362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.045175 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.045143 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt"] Apr 16 08:46:23.085174 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085220 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:23.085349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78b2w\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-kube-api-access-78b2w\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085550 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fc028e22-de25-4e9d-b201-a94f19dd4e66-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085550 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085642 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085642 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085596 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.085757 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.085668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.186624 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.186821 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.186821 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.186821 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.186821 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78b2w\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-kube-api-access-78b2w\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187028 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:46:23.186877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fc028e22-de25-4e9d-b201-a94f19dd4e66-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.186907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187220 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.187125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187220 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.187147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187410 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.187386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187610 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.187584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.187974 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.187954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fc028e22-de25-4e9d-b201-a94f19dd4e66-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.190289 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.190243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-podinfo\") 
pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.190619 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.190588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.195582 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.195530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.196028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.196005 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78b2w\" (UniqueName: \"kubernetes.io/projected/fc028e22-de25-4e9d-b201-a94f19dd4e66-kube-api-access-78b2w\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt\" (UID: \"fc028e22-de25-4e9d-b201-a94f19dd4e66\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.224549 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.224516 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:23.228489 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:46:23.228456 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9eccfee_e4a0_4a84_be39_271840d1a2ae.slice/crio-a2afb225c2aaeafac3c7553af6cb7f85c53890dd14ac7b1665cc8f5b9776e3ca WatchSource:0}: Error finding container a2afb225c2aaeafac3c7553af6cb7f85c53890dd14ac7b1665cc8f5b9776e3ca: Status 404 returned error can't find the container with id a2afb225c2aaeafac3c7553af6cb7f85c53890dd14ac7b1665cc8f5b9776e3ca Apr 16 08:46:23.346607 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.346579 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:23.479569 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:23.479541 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt"] Apr 16 08:46:23.482742 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:46:23.482688 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc028e22_de25_4e9d_b201_a94f19dd4e66.slice/crio-066e55776d58d2ccab96665446d35b9815f7d043d6642d4d106e0ef1b8dd9d2b WatchSource:0}: Error finding container 066e55776d58d2ccab96665446d35b9815f7d043d6642d4d106e0ef1b8dd9d2b: Status 404 returned error can't find the container with id 066e55776d58d2ccab96665446d35b9815f7d043d6642d4d106e0ef1b8dd9d2b Apr 16 08:46:24.152866 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:24.152828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" event={"ID":"fc028e22-de25-4e9d-b201-a94f19dd4e66","Type":"ContainerStarted","Data":"066e55776d58d2ccab96665446d35b9815f7d043d6642d4d106e0ef1b8dd9d2b"} Apr 16 08:46:24.154155 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:24.154121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" event={"ID":"c9eccfee-e4a0-4a84-be39-271840d1a2ae","Type":"ContainerStarted","Data":"a2afb225c2aaeafac3c7553af6cb7f85c53890dd14ac7b1665cc8f5b9776e3ca"} Apr 16 08:46:25.668047 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.668011 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:25.668286 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.668082 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:25.668286 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.668112 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:25.768883 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.768843 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:25.769008 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.768911 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:25.769008 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:25.768937 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:46:26.162200 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.162162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" 
event={"ID":"fc028e22-de25-4e9d-b201-a94f19dd4e66","Type":"ContainerStarted","Data":"0331b27c3d967d58623a055ddd0f7e3d5538a4e94c7919d9c14fcabd86193b8e"} Apr 16 08:46:26.163539 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.163513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" event={"ID":"c9eccfee-e4a0-4a84-be39-271840d1a2ae","Type":"ContainerStarted","Data":"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a"} Apr 16 08:46:26.197398 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.197344 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" podStartSLOduration=1.760043056 podStartE2EDuration="4.19732689s" podCreationTimestamp="2026-04-16 08:46:22 +0000 UTC" firstStartedPulling="2026-04-16 08:46:23.230468078 +0000 UTC m=+435.150453948" lastFinishedPulling="2026-04-16 08:46:25.667751918 +0000 UTC m=+437.587737782" observedRunningTime="2026-04-16 08:46:26.196300616 +0000 UTC m=+438.116286502" watchObservedRunningTime="2026-04-16 08:46:26.19732689 +0000 UTC m=+438.117313167" Apr 16 08:46:26.197964 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.197931 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" podStartSLOduration=0.914239167 podStartE2EDuration="3.197923777s" podCreationTimestamp="2026-04-16 08:46:23 +0000 UTC" firstStartedPulling="2026-04-16 08:46:23.484860528 +0000 UTC m=+435.404846385" lastFinishedPulling="2026-04-16 08:46:25.76854514 +0000 UTC m=+437.688530995" observedRunningTime="2026-04-16 08:46:26.179939761 +0000 UTC m=+438.099925665" watchObservedRunningTime="2026-04-16 08:46:26.197923777 +0000 UTC m=+438.117909654" Apr 16 08:46:26.347808 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.347767 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:26.352473 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:26.352449 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:27.086401 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.086366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:27.087769 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.087741 2578 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" start-of-body= Apr 16 08:46:27.087884 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.087793 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" Apr 16 08:46:27.166617 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.166588 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:27.167663 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.167641 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt" Apr 16 08:46:27.212492 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:27.212458 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:28.086134 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:28.086101 2578 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" start-of-body= Apr 16 08:46:28.086313 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:28.086155 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" Apr 16 08:46:29.086402 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:29.086348 2578 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" start-of-body= Apr 16 08:46:29.086811 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:29.086428 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.25:15021/healthz/ready\": dial tcp 10.133.0.25:15021: connect: connection refused" Apr 16 08:46:29.173446 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:29.173408 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" containerID="cri-o://97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a" gracePeriod=30 Apr 16 08:46:34.426668 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.426644 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:34.491589 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491508 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491589 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491547 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491589 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491565 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491612 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491643 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491695 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491756 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491789 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.491855 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491851 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qpnh\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh\") pod \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\" (UID: \"c9eccfee-e4a0-4a84-be39-271840d1a2ae\") " Apr 16 08:46:34.492126 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491870 2578 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:46:34.492126 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.491942 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:46:34.492126 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492067 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:46:34.492245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492093 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:46:34.492245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492120 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data" (OuterVolumeSpecName: "istio-data") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:46:34.492245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492202 2578 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-certs\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.492245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492225 2578 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istiod-ca-cert\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.492245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.492242 2578 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-workload-socket\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.494205 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.494179 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "istio-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 16 08:46:34.494337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.494268 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh" (OuterVolumeSpecName: "kube-api-access-9qpnh") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "kube-api-access-9qpnh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:46:34.494337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.494313 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:46:34.494337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.494327 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token" (OuterVolumeSpecName: "istio-token") pod "c9eccfee-e4a0-4a84-be39-271840d1a2ae" (UID: "c9eccfee-e4a0-4a84-be39-271840d1a2ae"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:46:34.593415 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593383 2578 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-envoy\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.593415 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593408 2578 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-data\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.593415 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593419 2578 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-podinfo\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.593640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593428 2578 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9eccfee-e4a0-4a84-be39-271840d1a2ae-credential-socket\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.593640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593438 2578 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-istio-token\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:34.593640 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:34.593447 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qpnh\" (UniqueName: \"kubernetes.io/projected/c9eccfee-e4a0-4a84-be39-271840d1a2ae-kube-api-access-9qpnh\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:35.192667 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.192617 2578 generic.go:358] "Generic (PLEG): container finished" podID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerID="97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a" 
exitCode=0 Apr 16 08:46:35.192841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.192697 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" Apr 16 08:46:35.192841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.192701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" event={"ID":"c9eccfee-e4a0-4a84-be39-271840d1a2ae","Type":"ContainerDied","Data":"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a"} Apr 16 08:46:35.192841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.192761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n" event={"ID":"c9eccfee-e4a0-4a84-be39-271840d1a2ae","Type":"ContainerDied","Data":"a2afb225c2aaeafac3c7553af6cb7f85c53890dd14ac7b1665cc8f5b9776e3ca"} Apr 16 08:46:35.192841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.192776 2578 scope.go:117] "RemoveContainer" containerID="97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a" Apr 16 08:46:35.201214 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.201189 2578 scope.go:117] "RemoveContainer" containerID="97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a" Apr 16 08:46:35.201461 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:46:35.201441 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a\": container with ID starting with 97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a not found: ID does not exist" containerID="97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a" Apr 16 08:46:35.201542 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.201469 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a"} err="failed to get container status \"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a\": rpc error: code = NotFound desc = could not find container \"97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a\": container with ID starting with 97a75b51d25006a61955975959ab5e480b0b5db9f896dde9971f59f43e36450a not found: ID does not exist" Apr 16 08:46:35.210588 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.210560 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:35.214108 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:35.214084 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lrr5n"] Apr 16 08:46:36.666542 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:36.666505 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" path="/var/lib/kubelet/pods/c9eccfee-e4a0-4a84-be39-271840d1a2ae/volumes" Apr 16 08:46:37.151133 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.151096 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:37.151482 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.151469 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" Apr 16 08:46:37.151522 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.151484 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" Apr 16 08:46:37.151563 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.151553 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9eccfee-e4a0-4a84-be39-271840d1a2ae" containerName="istio-proxy" Apr 16 08:46:37.154242 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.154223 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:37.156674 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.156652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 08:46:37.156787 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.156671 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bth84\"" Apr 16 08:46:37.156839 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.156821 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 08:46:37.163614 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.163586 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:37.216048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.216012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pr67\" (UniqueName: \"kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67\") pod \"kuadrant-operator-catalog-9584z\" (UID: \"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c\") " pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:37.317251 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.317214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pr67\" (UniqueName: \"kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67\") pod \"kuadrant-operator-catalog-9584z\" (UID: \"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c\") " pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:37.324915 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.324893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pr67\" (UniqueName: \"kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67\") pod \"kuadrant-operator-catalog-9584z\" (UID: \"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c\") " pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:37.465672 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.465568 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:37.525909 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.525849 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:37.591761 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.591735 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:37.593765 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:46:37.593739 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee57fd3_76c6_4dbd_83a0_37199b70ec7c.slice/crio-9cf41f59977b1a4bed7dff10a453d2ea46c64f6444c31969d7646b882599983b WatchSource:0}: Error finding container 9cf41f59977b1a4bed7dff10a453d2ea46c64f6444c31969d7646b882599983b: Status 404 returned error can't find the container with id 9cf41f59977b1a4bed7dff10a453d2ea46c64f6444c31969d7646b882599983b Apr 16 08:46:37.732169 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.732091 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s9hcc"] Apr 16 08:46:37.734979 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.734962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:37.741394 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.741366 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s9hcc"] Apr 16 08:46:37.821140 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.821105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv6mg\" (UniqueName: \"kubernetes.io/projected/074e55dc-7424-4e94-8cba-fc8d31c62fe0-kube-api-access-mv6mg\") pod \"kuadrant-operator-catalog-s9hcc\" (UID: \"074e55dc-7424-4e94-8cba-fc8d31c62fe0\") " pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:37.922062 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.922010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv6mg\" (UniqueName: \"kubernetes.io/projected/074e55dc-7424-4e94-8cba-fc8d31c62fe0-kube-api-access-mv6mg\") pod \"kuadrant-operator-catalog-s9hcc\" (UID: \"074e55dc-7424-4e94-8cba-fc8d31c62fe0\") " pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:37.930262 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:37.930221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv6mg\" (UniqueName: \"kubernetes.io/projected/074e55dc-7424-4e94-8cba-fc8d31c62fe0-kube-api-access-mv6mg\") pod \"kuadrant-operator-catalog-s9hcc\" (UID: \"074e55dc-7424-4e94-8cba-fc8d31c62fe0\") " pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:38.046150 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:38.046060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:38.177871 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:38.177847 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s9hcc"] Apr 16 08:46:38.204128 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:38.204096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9584z" event={"ID":"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c","Type":"ContainerStarted","Data":"9cf41f59977b1a4bed7dff10a453d2ea46c64f6444c31969d7646b882599983b"} Apr 16 08:46:38.214545 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:46:38.214505 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074e55dc_7424_4e94_8cba_fc8d31c62fe0.slice/crio-5233e42e6acadd7fa8d21ee255d577aab29cc6319a54730501a14ea428fddb56 WatchSource:0}: Error finding container 5233e42e6acadd7fa8d21ee255d577aab29cc6319a54730501a14ea428fddb56: Status 404 returned error can't find the container with id 5233e42e6acadd7fa8d21ee255d577aab29cc6319a54730501a14ea428fddb56 Apr 16 08:46:39.210273 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:39.210238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" event={"ID":"074e55dc-7424-4e94-8cba-fc8d31c62fe0","Type":"ContainerStarted","Data":"5233e42e6acadd7fa8d21ee255d577aab29cc6319a54730501a14ea428fddb56"} Apr 16 08:46:40.215396 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.215353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9584z" event={"ID":"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c","Type":"ContainerStarted","Data":"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb"} Apr 16 08:46:40.215892 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.215449 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-9584z" podUID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" containerName="registry-server" containerID="cri-o://0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb" gracePeriod=2 Apr 16 08:46:40.217223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.217198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" event={"ID":"074e55dc-7424-4e94-8cba-fc8d31c62fe0","Type":"ContainerStarted","Data":"ea63458155941181b0f61bfd9b4eaef98a04e2518bafd572012d09b67bf3d079"} Apr 16 08:46:40.233349 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.233304 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-9584z" podStartSLOduration=1.127453313 podStartE2EDuration="3.23328799s" podCreationTimestamp="2026-04-16 08:46:37 +0000 UTC" firstStartedPulling="2026-04-16 08:46:37.595036983 +0000 UTC m=+449.515022839" lastFinishedPulling="2026-04-16 08:46:39.700871661 +0000 UTC m=+451.620857516" observedRunningTime="2026-04-16 08:46:40.231372582 +0000 UTC m=+452.151358457" watchObservedRunningTime="2026-04-16 08:46:40.23328799 +0000 UTC m=+452.153273864" Apr 16 08:46:40.245462 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.245410 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" podStartSLOduration=1.75736066 podStartE2EDuration="3.245395042s" podCreationTimestamp="2026-04-16 08:46:37 +0000 UTC" 
firstStartedPulling="2026-04-16 08:46:38.215894172 +0000 UTC m=+450.135880026" lastFinishedPulling="2026-04-16 08:46:39.703928544 +0000 UTC m=+451.623914408" observedRunningTime="2026-04-16 08:46:40.244329433 +0000 UTC m=+452.164315308" watchObservedRunningTime="2026-04-16 08:46:40.245395042 +0000 UTC m=+452.165380916" Apr 16 08:46:40.462243 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.462210 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:40.547241 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.547142 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pr67\" (UniqueName: \"kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67\") pod \"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c\" (UID: \"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c\") " Apr 16 08:46:40.549537 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.549499 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67" (OuterVolumeSpecName: "kube-api-access-8pr67") pod "2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" (UID: "2ee57fd3-76c6-4dbd-83a0-37199b70ec7c"). InnerVolumeSpecName "kube-api-access-8pr67". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:46:40.648327 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:40.648283 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pr67\" (UniqueName: \"kubernetes.io/projected/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c-kube-api-access-8pr67\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:46:41.221048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.221008 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" containerID="0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb" exitCode=0 Apr 16 08:46:41.221493 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.221088 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9584z" Apr 16 08:46:41.221493 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.221109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9584z" event={"ID":"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c","Type":"ContainerDied","Data":"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb"} Apr 16 08:46:41.221493 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.221151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9584z" event={"ID":"2ee57fd3-76c6-4dbd-83a0-37199b70ec7c","Type":"ContainerDied","Data":"9cf41f59977b1a4bed7dff10a453d2ea46c64f6444c31969d7646b882599983b"} Apr 16 08:46:41.221493 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.221166 2578 scope.go:117] "RemoveContainer" containerID="0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb" Apr 16 08:46:41.230352 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.230336 2578 scope.go:117] "RemoveContainer" containerID="0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb" Apr 16 08:46:41.230617 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:46:41.230598 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb\": container with ID starting with 0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb not found: ID does not exist" containerID="0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb" Apr 16 08:46:41.230677 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.230630 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb"} err="failed to get container status \"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb\": rpc error: code = NotFound desc = could not find container \"0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb\": container with ID starting with 0d191185037f20d4a9fc6f039858ab40919dcaf265df35b6fb829657932806bb not found: ID does not exist" Apr 16 08:46:41.237102 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.237074 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:41.240264 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:41.240242 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9584z"] Apr 16 08:46:42.666337 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:42.666293 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" path="/var/lib/kubelet/pods/2ee57fd3-76c6-4dbd-83a0-37199b70ec7c/volumes" Apr 16 08:46:48.046178 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:48.046146 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:48.046643 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:48.046248 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:48.068217 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:48.068188 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:48.268672 ip-10-0-139-8 
kubenswrapper[2578]: I0416 08:46:48.268642 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-s9hcc" Apr 16 08:46:52.485175 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.485137 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85b4bfbb8c-kvbhs"] Apr 16 08:46:52.485656 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.485633 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" containerName="registry-server" Apr 16 08:46:52.485764 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.485661 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" containerName="registry-server" Apr 16 08:46:52.485816 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.485786 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ee57fd3-76c6-4dbd-83a0-37199b70ec7c" containerName="registry-server" Apr 16 08:46:52.492289 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.492251 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.496043 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496017 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w94jt\"" Apr 16 08:46:52.496191 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 08:46:52.496191 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496066 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 08:46:52.496338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 08:46:52.496536 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496515 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b4bfbb8c-kvbhs"] Apr 16 08:46:52.496981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 08:46:52.497093 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.496981 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 08:46:52.500899 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.500876 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 08:46:52.560409 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-service-ca\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560433 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-oauth-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-oauth-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-console-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-trusted-ca-bundle\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.560789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.560690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjsk\" (UniqueName: \"kubernetes.io/projected/337fbebc-d7d7-4879-a750-c451f801dd55-kube-api-access-qjjsk\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.661987 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.661943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-oauth-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662150 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662150 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-console-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: 
\"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662150 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-trusted-ca-bundle\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjsk\" (UniqueName: \"kubernetes.io/projected/337fbebc-d7d7-4879-a750-c451f801dd55-kube-api-access-qjjsk\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-service-ca\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662441 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-oauth-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.662950 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.662863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-console-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.663123 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.663044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-trusted-ca-bundle\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.663123 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.663085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-service-ca\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.663256 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.663235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/337fbebc-d7d7-4879-a750-c451f801dd55-oauth-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.664795 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.664771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-serving-cert\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.664884 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.664844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/337fbebc-d7d7-4879-a750-c451f801dd55-console-oauth-config\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.670523 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.670499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjsk\" (UniqueName: \"kubernetes.io/projected/337fbebc-d7d7-4879-a750-c451f801dd55-kube-api-access-qjjsk\") pod \"console-85b4bfbb8c-kvbhs\" (UID: \"337fbebc-d7d7-4879-a750-c451f801dd55\") " pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.805560 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.805468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:46:52.930119 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:52.930079 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b4bfbb8c-kvbhs"] Apr 16 08:46:52.933254 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:46:52.933225 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod337fbebc_d7d7_4879_a750_c451f801dd55.slice/crio-afd90eb70989d606847ceff9878475dfab40c86ce8d5dbc74bbaffe6640b6b49 WatchSource:0}: Error finding container afd90eb70989d606847ceff9878475dfab40c86ce8d5dbc74bbaffe6640b6b49: Status 404 returned error can't find the container with id afd90eb70989d606847ceff9878475dfab40c86ce8d5dbc74bbaffe6640b6b49 Apr 16 08:46:53.265600 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:53.265561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b4bfbb8c-kvbhs" event={"ID":"337fbebc-d7d7-4879-a750-c451f801dd55","Type":"ContainerStarted","Data":"09f17e7084a72de69bd5d237a86eaf6949a20f8540987bdf5fa1387773e98856"} Apr 16 08:46:53.265600 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:53.265601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b4bfbb8c-kvbhs" event={"ID":"337fbebc-d7d7-4879-a750-c451f801dd55","Type":"ContainerStarted","Data":"afd90eb70989d606847ceff9878475dfab40c86ce8d5dbc74bbaffe6640b6b49"} Apr 16 08:46:53.282535 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:46:53.282470 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85b4bfbb8c-kvbhs" podStartSLOduration=1.28244968 podStartE2EDuration="1.28244968s" podCreationTimestamp="2026-04-16 08:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:46:53.281688456 +0000 UTC m=+465.201674343" watchObservedRunningTime="2026-04-16 08:46:53.28244968 +0000 UTC m=+465.202435556" Apr 16 08:47:02.806011 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:02.805970 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:47:02.806499 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:02.806051 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:47:02.810829 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:02.810808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:47:03.304629 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:03.304595 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85b4bfbb8c-kvbhs" Apr 16 08:47:10.800482 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.800446 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2"] Apr 16 08:47:10.811945 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.811926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.814968 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.814947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-zv6tx\"" Apr 16 08:47:10.815730 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.815691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2"] Apr 16 08:47:10.825202 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.825174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.825327 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.825249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67lx\" (UniqueName: \"kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.926293 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.926259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.926465 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.926308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w67lx\" (UniqueName: \"kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.926626 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.926604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:10.949074 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:10.949044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67lx\" (UniqueName: \"kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:11.123587 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:11.123555 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:11.254037 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:11.254003 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2"] Apr 16 08:47:11.256075 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:47:11.256043 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb54077d1_bfa1_4086_aa66_5f01e0e437ee.slice/crio-0d9e3c64291f41335dee3a61db7a67bf565ce80c82948f8f74e517ccc7283c59 WatchSource:0}: Error finding container 0d9e3c64291f41335dee3a61db7a67bf565ce80c82948f8f74e517ccc7283c59: Status 404 returned error can't find the container with id 0d9e3c64291f41335dee3a61db7a67bf565ce80c82948f8f74e517ccc7283c59 Apr 16 08:47:11.330399 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:11.330359 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" event={"ID":"b54077d1-bfa1-4086-aa66-5f01e0e437ee","Type":"ContainerStarted","Data":"0d9e3c64291f41335dee3a61db7a67bf565ce80c82948f8f74e517ccc7283c59"} Apr 16 08:47:16.028305 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.028267 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-q8bvd"] Apr 16 08:47:16.031970 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.031943 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:16.034408 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.034381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-8t9tv\"" Apr 16 08:47:16.043281 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.043253 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-q8bvd"] Apr 16 08:47:16.070467 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.070437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmk9v\" (UniqueName: \"kubernetes.io/projected/bc9c7905-4995-4372-bb7f-d566eddd696a-kube-api-access-fmk9v\") pod \"authorino-operator-657f44b778-q8bvd\" (UID: \"bc9c7905-4995-4372-bb7f-d566eddd696a\") " pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:16.171407 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.171360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmk9v\" (UniqueName: \"kubernetes.io/projected/bc9c7905-4995-4372-bb7f-d566eddd696a-kube-api-access-fmk9v\") pod \"authorino-operator-657f44b778-q8bvd\" (UID: \"bc9c7905-4995-4372-bb7f-d566eddd696a\") " pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:16.185639 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.185598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmk9v\" (UniqueName: \"kubernetes.io/projected/bc9c7905-4995-4372-bb7f-d566eddd696a-kube-api-access-fmk9v\") pod \"authorino-operator-657f44b778-q8bvd\" (UID: \"bc9c7905-4995-4372-bb7f-d566eddd696a\") " pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:16.345379 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.345348 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:16.505841 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:16.505811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-q8bvd"] Apr 16 08:47:16.507531 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:47:16.507496 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9c7905_4995_4372_bb7f_d566eddd696a.slice/crio-3f22ae1aba2087a0f8957dfcb3f8ef903506f3da6290a7d67ca46a157950797d WatchSource:0}: Error finding container 3f22ae1aba2087a0f8957dfcb3f8ef903506f3da6290a7d67ca46a157950797d: Status 404 returned error can't find the container with id 3f22ae1aba2087a0f8957dfcb3f8ef903506f3da6290a7d67ca46a157950797d Apr 16 08:47:17.356760 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:17.356702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" event={"ID":"b54077d1-bfa1-4086-aa66-5f01e0e437ee","Type":"ContainerStarted","Data":"58904254166d9210cd0841d465c5057b8ab0a097bd4f2903436f6c2516850fbe"} Apr 16 08:47:17.357223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:17.356804 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:17.357992 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:17.357957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" event={"ID":"bc9c7905-4995-4372-bb7f-d566eddd696a","Type":"ContainerStarted","Data":"3f22ae1aba2087a0f8957dfcb3f8ef903506f3da6290a7d67ca46a157950797d"} Apr 16 08:47:17.381902 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:17.380897 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" podStartSLOduration=2.33012773 podStartE2EDuration="7.380879864s" podCreationTimestamp="2026-04-16 08:47:10 +0000 UTC" firstStartedPulling="2026-04-16 08:47:11.258355154 +0000 UTC m=+483.178341010" lastFinishedPulling="2026-04-16 08:47:16.309107291 +0000 UTC m=+488.229093144" observedRunningTime="2026-04-16 08:47:17.379230925 +0000 UTC m=+489.299216841" watchObservedRunningTime="2026-04-16 08:47:17.380879864 +0000 UTC m=+489.300865740" Apr 16 08:47:19.372344 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:19.372302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" event={"ID":"bc9c7905-4995-4372-bb7f-d566eddd696a","Type":"ContainerStarted","Data":"a0cedfafd23be0af6847d22f79014164c1ff8e5d360935cdef1d29f0f8d8a4f4"} Apr 16 08:47:19.372771 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:19.372429 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:19.391957 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:19.391903 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" podStartSLOduration=1.36205136 podStartE2EDuration="3.391887013s" podCreationTimestamp="2026-04-16 08:47:16 +0000 UTC" firstStartedPulling="2026-04-16 08:47:16.509643479 +0000 UTC m=+488.429629336" lastFinishedPulling="2026-04-16 08:47:18.539479137 +0000 UTC m=+490.459464989" observedRunningTime="2026-04-16 08:47:19.390207407 +0000 UTC 
m=+491.310193308" watchObservedRunningTime="2026-04-16 08:47:19.391887013 +0000 UTC m=+491.311872890" Apr 16 08:47:28.370683 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:28.370651 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:29.284371 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.284332 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2"] Apr 16 08:47:29.284595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.284572 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" containerName="manager" containerID="cri-o://58904254166d9210cd0841d465c5057b8ab0a097bd4f2903436f6c2516850fbe" gracePeriod=2 Apr 16 08:47:29.294931 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.294900 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2"] Apr 16 08:47:29.309643 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.309611 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:29.309986 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.309972 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" containerName="manager" Apr 16 08:47:29.310026 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.309988 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" containerName="manager" Apr 16 08:47:29.310068 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.310055 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" containerName="manager" Apr 16 08:47:29.313271 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.313244 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.324270 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.324240 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:29.386684 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.386656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfsjx\" (UniqueName: \"kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.387075 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.386737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.408415 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.408386 2578 generic.go:358] "Generic (PLEG): container finished" podID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" containerID="58904254166d9210cd0841d465c5057b8ab0a097bd4f2903436f6c2516850fbe" exitCode=0 Apr 16 08:47:29.488188 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.488145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfsjx\" (UniqueName: \"kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.488368 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.488232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.488650 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.488620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.496673 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.496644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfsjx\" (UniqueName: \"kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cpkkr\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.516944 ip-10-0-139-8 kubenswrapper[2578]: I0416 
08:47:29.516920 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:29.519103 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.519076 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" is forbidden: User \"system:node:ip-10-0-139-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-8.ec2.internal' and this object" Apr 16 08:47:29.589429 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.589398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume\") pod \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " Apr 16 08:47:29.589555 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.589438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w67lx\" (UniqueName: \"kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx\") pod \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\" (UID: \"b54077d1-bfa1-4086-aa66-5f01e0e437ee\") " Apr 16 08:47:29.591385 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.590168 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b54077d1-bfa1-4086-aa66-5f01e0e437ee" (UID: "b54077d1-bfa1-4086-aa66-5f01e0e437ee"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:47:29.596239 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.596211 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx" (OuterVolumeSpecName: "kube-api-access-w67lx") pod "b54077d1-bfa1-4086-aa66-5f01e0e437ee" (UID: "b54077d1-bfa1-4086-aa66-5f01e0e437ee"). InnerVolumeSpecName "kube-api-access-w67lx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:47:29.675867 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.675827 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:29.691028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.690992 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b54077d1-bfa1-4086-aa66-5f01e0e437ee-extensions-socket-volume\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:47:29.691028 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.691021 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w67lx\" (UniqueName: \"kubernetes.io/projected/b54077d1-bfa1-4086-aa66-5f01e0e437ee-kube-api-access-w67lx\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:47:29.806123 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:29.806086 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:29.810141 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:47:29.810115 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3053d60_28d8_46d1_9532_5e9ddbbf6dd7.slice/crio-500f699e29594ecc66c3a337887bacd45748545e0c49aad2e09ae80da433fb57 WatchSource:0}: Error finding container 500f699e29594ecc66c3a337887bacd45748545e0c49aad2e09ae80da433fb57: Status 404 returned error can't find the container with id 500f699e29594ecc66c3a337887bacd45748545e0c49aad2e09ae80da433fb57 Apr 16 08:47:30.377684 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.377653 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-q8bvd" Apr 16 08:47:30.380099 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.380071 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" is forbidden: User \"system:node:ip-10-0-139-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-8.ec2.internal' and this object" Apr 16 08:47:30.413936 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.413905 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" Apr 16 08:47:30.413936 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.413933 2578 scope.go:117] "RemoveContainer" containerID="58904254166d9210cd0841d465c5057b8ab0a097bd4f2903436f6c2516850fbe" Apr 16 08:47:30.415847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.415820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" event={"ID":"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7","Type":"ContainerStarted","Data":"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475"} Apr 16 08:47:30.415847 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.415858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" event={"ID":"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7","Type":"ContainerStarted","Data":"500f699e29594ecc66c3a337887bacd45748545e0c49aad2e09ae80da433fb57"} Apr 16 08:47:30.416076 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.416059 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:30.416664 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.416494 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" is forbidden: User \"system:node:ip-10-0-139-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-8.ec2.internal' and this object" Apr 16 08:47:30.418681 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.418642 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" is forbidden: User \"system:node:ip-10-0-139-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-8.ec2.internal' and this object" Apr 16 08:47:30.445692 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.445641 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" podStartSLOduration=1.445626328 podStartE2EDuration="1.445626328s" podCreationTimestamp="2026-04-16 08:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:47:30.443585435 +0000 UTC m=+502.363571308" watchObservedRunningTime="2026-04-16 08:47:30.445626328 +0000 UTC m=+502.365612203" Apr 16 08:47:30.445882 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.445828 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5rpp2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5rpp2\" is forbidden: User \"system:node:ip-10-0-139-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-8.ec2.internal' and this object" Apr 16 
08:47:30.666188 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:30.666101 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54077d1-bfa1-4086-aa66-5f01e0e437ee" path="/var/lib/kubelet/pods/b54077d1-bfa1-4086-aa66-5f01e0e437ee/volumes" Apr 16 08:47:41.423633 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:41.423603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:45.881588 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:45.881545 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:45.882008 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:45.881831 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" podUID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" containerName="manager" containerID="cri-o://001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475" gracePeriod=10 Apr 16 08:47:46.128835 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.128810 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:46.241573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.241475 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfsjx\" (UniqueName: \"kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx\") pod \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " Apr 16 08:47:46.241573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.241531 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume\") pod \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\" (UID: \"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7\") " Apr 16 08:47:46.241995 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.241970 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" (UID: "c3053d60-28d8-46d1-9532-5e9ddbbf6dd7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 08:47:46.243816 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.243795 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx" (OuterVolumeSpecName: "kube-api-access-sfsjx") pod "c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" (UID: "c3053d60-28d8-46d1-9532-5e9ddbbf6dd7"). InnerVolumeSpecName "kube-api-access-sfsjx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:47:46.342829 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.342796 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfsjx\" (UniqueName: \"kubernetes.io/projected/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-kube-api-access-sfsjx\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:47:46.342829 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.342822 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7-extensions-socket-volume\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:47:46.476242 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.476210 2578 generic.go:358] "Generic (PLEG): container finished" podID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" containerID="001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475" exitCode=0 Apr 16 08:47:46.476242 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.476247 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" event={"ID":"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7","Type":"ContainerDied","Data":"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475"} Apr 16 08:47:46.476486 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.476269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" event={"ID":"c3053d60-28d8-46d1-9532-5e9ddbbf6dd7","Type":"ContainerDied","Data":"500f699e29594ecc66c3a337887bacd45748545e0c49aad2e09ae80da433fb57"} Apr 16 08:47:46.476486 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.476285 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr" Apr 16 08:47:46.476486 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.476293 2578 scope.go:117] "RemoveContainer" containerID="001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475" Apr 16 08:47:46.484561 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.484544 2578 scope.go:117] "RemoveContainer" containerID="001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475" Apr 16 08:47:46.485048 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:47:46.485026 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475\": container with ID starting with 001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475 not found: ID does not exist" containerID="001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475" Apr 16 08:47:46.485125 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.485060 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475"} err="failed to get container status \"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475\": rpc error: code = NotFound desc = could not find container \"001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475\": container with ID starting with 001933ae4e66d12de588d9ab69cad4210015b68140b944b619b20961bbf4e475 not found: ID does not exist" Apr 16 08:47:46.502578 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.502506 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:46.509335 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.509313 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cpkkr"] Apr 16 08:47:46.666545 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:47:46.666511 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" path="/var/lib/kubelet/pods/c3053d60-28d8-46d1-9532-5e9ddbbf6dd7/volumes" Apr 16 08:48:02.110424 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.110338 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq"] Apr 16 08:48:02.110888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.110680 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" containerName="manager" Apr 16 08:48:02.110888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.110690 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" containerName="manager" Apr 16 08:48:02.110888 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.110767 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3053d60-28d8-46d1-9532-5e9ddbbf6dd7" containerName="manager" Apr 16 08:48:02.112812 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.112790 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.115273 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.115251 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-82w56\"" Apr 16 08:48:02.128274 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.128246 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq"] Apr 16 08:48:02.180289 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbpf\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-kube-api-access-mdbpf\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180441 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5b162dcf-fda9-46fb-8501-7b81824cefca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180780 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.180780 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.180575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281393 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: 
\"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbpf\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-kube-api-access-mdbpf\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281603 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5b162dcf-fda9-46fb-8501-7b81824cefca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281949 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281949 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.281949 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.282202 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.282202 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.281987 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.282452 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.282254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.282452 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.282442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5b162dcf-fda9-46fb-8501-7b81824cefca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.284040 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.284012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.284302 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.284284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.289071 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.289043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.289179 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.289102 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbpf\" (UniqueName: \"kubernetes.io/projected/5b162dcf-fda9-46fb-8501-7b81824cefca-kube-api-access-mdbpf\") pod \"maas-default-gateway-openshift-default-58b6f876-vjxhq\" (UID: \"5b162dcf-fda9-46fb-8501-7b81824cefca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.425181 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.425094 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:02.554000 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.553976 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq"] Apr 16 08:48:02.556049 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:02.556022 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b162dcf_fda9_46fb_8501_7b81824cefca.slice/crio-8bbf97e60f50bcc239c78317ca9c9f2b2611da99b8d1e6f15a7a49950c7b0962 WatchSource:0}: Error finding container 8bbf97e60f50bcc239c78317ca9c9f2b2611da99b8d1e6f15a7a49950c7b0962: Status 404 returned error can't find the container with id 8bbf97e60f50bcc239c78317ca9c9f2b2611da99b8d1e6f15a7a49950c7b0962 Apr 16 08:48:02.558215 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.558185 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:48:02.558300 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.558251 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:48:02.558300 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:02.558287 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 08:48:03.538448 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:03.538414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" event={"ID":"5b162dcf-fda9-46fb-8501-7b81824cefca","Type":"ContainerStarted","Data":"3ec44a62bc19d5517fc56851f22cccead2f4626dd15bb8a8a0a1d9405fe5b52e"} Apr 16 08:48:03.538448 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:03.538451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" event={"ID":"5b162dcf-fda9-46fb-8501-7b81824cefca","Type":"ContainerStarted","Data":"8bbf97e60f50bcc239c78317ca9c9f2b2611da99b8d1e6f15a7a49950c7b0962"} Apr 16 08:48:03.556679 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:03.556621 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" podStartSLOduration=1.556604785 podStartE2EDuration="1.556604785s" podCreationTimestamp="2026-04-16 08:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:48:03.555560386 +0000 UTC m=+535.475546252" watchObservedRunningTime="2026-04-16 08:48:03.556604785 +0000 UTC m=+535.476590661" Apr 16 08:48:04.425430 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:04.425390 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:04.430249 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:04.430219 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:04.542362 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:04.542332 
2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:04.543341 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:04.543320 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-vjxhq" Apr 16 08:48:06.080889 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.080807 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:06.083208 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.083190 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.085773 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.085749 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cmwnl\"" Apr 16 08:48:06.085883 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.085794 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 08:48:06.092216 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.092187 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:06.120071 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.120043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jfvd\" (UniqueName: \"kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.120198 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.120095 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.177426 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.177395 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:06.221386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.221346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.221580 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.221464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jfvd\" (UniqueName: \"kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.222102 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.222079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.229145 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.229118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jfvd\" (UniqueName: \"kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd\") pod \"limitador-limitador-7d549b5b-shh6z\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.394354 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.394321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:06.528026 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.527969 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:06.530655 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:06.530624 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4bb2bf_7323_4577_b5d8_3fffd8192c2e.slice/crio-0d18abe6eaa4d6ddd9fae1afb7ecef5dd1bee18e59e66ae03a88f4955e1f5097 WatchSource:0}: Error finding container 0d18abe6eaa4d6ddd9fae1afb7ecef5dd1bee18e59e66ae03a88f4955e1f5097: Status 404 returned error can't find the container with id 0d18abe6eaa4d6ddd9fae1afb7ecef5dd1bee18e59e66ae03a88f4955e1f5097 Apr 16 08:48:06.550403 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.550377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" event={"ID":"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e","Type":"ContainerStarted","Data":"0d18abe6eaa4d6ddd9fae1afb7ecef5dd1bee18e59e66ae03a88f4955e1f5097"} Apr 16 08:48:06.959273 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.959237 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:06.965173 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.964498 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:06.967360 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.967333 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-57wvr\"" Apr 16 08:48:06.967783 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:06.967761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:07.029270 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.029236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgtx\" (UniqueName: \"kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx\") pod \"authorino-f99f4b5cd-hkfkd\" (UID: \"bdbc0402-3c18-443d-9384-233509fa7ba9\") " pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:07.126275 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.126238 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:07.130514 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.130478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgtx\" (UniqueName: \"kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx\") pod \"authorino-f99f4b5cd-hkfkd\" (UID: \"bdbc0402-3c18-443d-9384-233509fa7ba9\") " pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:07.130987 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.130971 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:07.135236 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.135213 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:07.143134 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.142404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgtx\" (UniqueName: \"kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx\") pod \"authorino-f99f4b5cd-hkfkd\" (UID: \"bdbc0402-3c18-443d-9384-233509fa7ba9\") " pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:07.232168 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.232081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsb94\" (UniqueName: \"kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94\") pod \"authorino-7498df8756-kgfdr\" (UID: \"d71bca0c-1ac4-4569-bf43-9737dcedb07e\") " pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:07.286461 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.285968 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:07.333776 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.333360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsb94\" (UniqueName: \"kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94\") pod \"authorino-7498df8756-kgfdr\" (UID: \"d71bca0c-1ac4-4569-bf43-9737dcedb07e\") " pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:07.343255 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.343183 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsb94\" (UniqueName: \"kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94\") pod \"authorino-7498df8756-kgfdr\" (UID: \"d71bca0c-1ac4-4569-bf43-9737dcedb07e\") " pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:07.456195 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.456112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:07.505950 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.505683 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:07.509602 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:07.509551 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdbc0402_3c18_443d_9384_233509fa7ba9.slice/crio-078eb908dd552a7f05ba2b4cbf101eb85c92857052ad3b406abb6b23d2354278 WatchSource:0}: Error finding container 078eb908dd552a7f05ba2b4cbf101eb85c92857052ad3b406abb6b23d2354278: Status 404 returned error can't find the container with id 078eb908dd552a7f05ba2b4cbf101eb85c92857052ad3b406abb6b23d2354278 Apr 16 08:48:07.555981 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.555890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" event={"ID":"bdbc0402-3c18-443d-9384-233509fa7ba9","Type":"ContainerStarted","Data":"078eb908dd552a7f05ba2b4cbf101eb85c92857052ad3b406abb6b23d2354278"} Apr 16 08:48:07.623239 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:07.623211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:07.626198 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:07.626165 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71bca0c_1ac4_4569_bf43_9737dcedb07e.slice/crio-d897a6e53863a34c43fa86f9f66192a45241802bb0ec39a38a736e20900c9ad1 WatchSource:0}: Error finding container d897a6e53863a34c43fa86f9f66192a45241802bb0ec39a38a736e20900c9ad1: Status 404 returned error can't find the container with id d897a6e53863a34c43fa86f9f66192a45241802bb0ec39a38a736e20900c9ad1 Apr 16 08:48:08.562253 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:08.562189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kgfdr" event={"ID":"d71bca0c-1ac4-4569-bf43-9737dcedb07e","Type":"ContainerStarted","Data":"d897a6e53863a34c43fa86f9f66192a45241802bb0ec39a38a736e20900c9ad1"} Apr 16 08:48:12.578926 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.578892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kgfdr" 
event={"ID":"d71bca0c-1ac4-4569-bf43-9737dcedb07e","Type":"ContainerStarted","Data":"e6d66404d5805e19cdffb13772775637bd2193014ca63701bcaf836927a687c9"} Apr 16 08:48:12.580277 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.580257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" event={"ID":"bdbc0402-3c18-443d-9384-233509fa7ba9","Type":"ContainerStarted","Data":"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b"} Apr 16 08:48:12.581558 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.581536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" event={"ID":"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e","Type":"ContainerStarted","Data":"b030537b09d8e988a0bf13d7cd868b2f21013196d53a2656ade8e24cd1c862b2"} Apr 16 08:48:12.581686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.581667 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:12.592536 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.592478 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-kgfdr" podStartSLOduration=1.398338927 podStartE2EDuration="5.59246051s" podCreationTimestamp="2026-04-16 08:48:07 +0000 UTC" firstStartedPulling="2026-04-16 08:48:07.627697104 +0000 UTC m=+539.547682959" lastFinishedPulling="2026-04-16 08:48:11.821818677 +0000 UTC m=+543.741804542" observedRunningTime="2026-04-16 08:48:12.591839197 +0000 UTC m=+544.511825076" watchObservedRunningTime="2026-04-16 08:48:12.59246051 +0000 UTC m=+544.512446386" Apr 16 08:48:12.608825 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.608741 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" podStartSLOduration=2.263030637 podStartE2EDuration="6.608702243s" podCreationTimestamp="2026-04-16 08:48:06 +0000 UTC" firstStartedPulling="2026-04-16 08:48:07.511378172 +0000 UTC m=+539.431364026" lastFinishedPulling="2026-04-16 08:48:11.85704977 +0000 UTC m=+543.777035632" observedRunningTime="2026-04-16 08:48:12.605603373 +0000 UTC m=+544.525589246" watchObservedRunningTime="2026-04-16 08:48:12.608702243 +0000 UTC m=+544.528688120" Apr 16 08:48:12.627153 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.627123 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:12.631949 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:12.631885 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" podStartSLOduration=1.2903026930000001 podStartE2EDuration="6.631868806s" podCreationTimestamp="2026-04-16 08:48:06 +0000 UTC" firstStartedPulling="2026-04-16 08:48:06.532533643 +0000 UTC m=+538.452519501" lastFinishedPulling="2026-04-16 08:48:11.87409976 +0000 UTC m=+543.794085614" observedRunningTime="2026-04-16 08:48:12.628348951 +0000 UTC m=+544.548334828" watchObservedRunningTime="2026-04-16 08:48:12.631868806 +0000 UTC m=+544.551854718" Apr 16 08:48:14.588902 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:14.588839 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" podUID="bdbc0402-3c18-443d-9384-233509fa7ba9" containerName="authorino" containerID="cri-o://58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b" gracePeriod=30 Apr 16 
08:48:14.830779 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:14.830702 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:14.909541 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:14.909459 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldgtx\" (UniqueName: \"kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx\") pod \"bdbc0402-3c18-443d-9384-233509fa7ba9\" (UID: \"bdbc0402-3c18-443d-9384-233509fa7ba9\") " Apr 16 08:48:14.911648 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:14.911615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx" (OuterVolumeSpecName: "kube-api-access-ldgtx") pod "bdbc0402-3c18-443d-9384-233509fa7ba9" (UID: "bdbc0402-3c18-443d-9384-233509fa7ba9"). InnerVolumeSpecName "kube-api-access-ldgtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:15.010680 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.010644 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldgtx\" (UniqueName: \"kubernetes.io/projected/bdbc0402-3c18-443d-9384-233509fa7ba9-kube-api-access-ldgtx\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:15.593499 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.593460 2578 generic.go:358] "Generic (PLEG): container finished" podID="bdbc0402-3c18-443d-9384-233509fa7ba9" containerID="58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b" exitCode=0 Apr 16 08:48:15.593967 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.593508 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" Apr 16 08:48:15.593967 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.593544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" event={"ID":"bdbc0402-3c18-443d-9384-233509fa7ba9","Type":"ContainerDied","Data":"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b"} Apr 16 08:48:15.593967 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.593579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hkfkd" event={"ID":"bdbc0402-3c18-443d-9384-233509fa7ba9","Type":"ContainerDied","Data":"078eb908dd552a7f05ba2b4cbf101eb85c92857052ad3b406abb6b23d2354278"} Apr 16 08:48:15.593967 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.593595 2578 scope.go:117] "RemoveContainer" containerID="58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b" Apr 16 08:48:15.602491 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.602473 2578 scope.go:117] "RemoveContainer" containerID="58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b" Apr 16 08:48:15.602772 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:48:15.602754 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b\": container with ID starting with 58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b not found: ID does not exist" containerID="58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b" Apr 16 08:48:15.602834 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.602814 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b"} err="failed to get container status \"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b\": rpc error: code = NotFound desc = could not find container \"58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b\": container with ID starting with 58d453a56281d62b06e3a0229cea2dfaabe4cd1cbdbc0b7c72aa605f0f3dfc4b not found: ID does not exist" Apr 16 08:48:15.613876 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.613848 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:15.616165 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:15.616136 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hkfkd"] Apr 16 08:48:16.666294 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:16.666258 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbc0402-3c18-443d-9384-233509fa7ba9" path="/var/lib/kubelet/pods/bdbc0402-3c18-443d-9384-233509fa7ba9/volumes" Apr 16 08:48:21.169909 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:21.169871 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:21.170392 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:21.170181 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" podUID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" containerName="limitador" containerID="cri-o://b030537b09d8e988a0bf13d7cd868b2f21013196d53a2656ade8e24cd1c862b2" gracePeriod=30 Apr 16 08:48:21.170937 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:21.170827 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:21.616658 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:21.616618 2578 generic.go:358] "Generic (PLEG): container finished" podID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" containerID="b030537b09d8e988a0bf13d7cd868b2f21013196d53a2656ade8e24cd1c862b2" exitCode=0 Apr 16 08:48:21.616852 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:21.616695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" event={"ID":"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e","Type":"ContainerDied","Data":"b030537b09d8e988a0bf13d7cd868b2f21013196d53a2656ade8e24cd1c862b2"} Apr 16 08:48:22.112175 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.112149 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:22.174954 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.174921 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file\") pod \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " Apr 16 08:48:22.175383 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.174989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jfvd\" (UniqueName: \"kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd\") pod \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\" (UID: \"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e\") " Apr 16 08:48:22.175383 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.175362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file" (OuterVolumeSpecName: "config-file") pod "1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" (UID: "1b4bb2bf-7323-4577-b5d8-3fffd8192c2e"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:48:22.177429 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.177395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd" (OuterVolumeSpecName: "kube-api-access-5jfvd") pod "1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" (UID: "1b4bb2bf-7323-4577-b5d8-3fffd8192c2e"). InnerVolumeSpecName "kube-api-access-5jfvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:22.276134 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.276051 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-config-file\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:22.276134 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.276082 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jfvd\" (UniqueName: \"kubernetes.io/projected/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e-kube-api-access-5jfvd\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:22.621905 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.621877 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" Apr 16 08:48:22.621905 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.621887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shh6z" event={"ID":"1b4bb2bf-7323-4577-b5d8-3fffd8192c2e","Type":"ContainerDied","Data":"0d18abe6eaa4d6ddd9fae1afb7ecef5dd1bee18e59e66ae03a88f4955e1f5097"} Apr 16 08:48:22.622124 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.621928 2578 scope.go:117] "RemoveContainer" containerID="b030537b09d8e988a0bf13d7cd868b2f21013196d53a2656ade8e24cd1c862b2" Apr 16 08:48:22.643123 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.643087 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:22.648634 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.648612 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shh6z"] Apr 16 08:48:22.668129 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:22.668099 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" path="/var/lib/kubelet/pods/1b4bb2bf-7323-4577-b5d8-3fffd8192c2e/volumes" Apr 16 08:48:27.170283 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.170243 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-s7mth"] Apr 16 08:48:27.170864 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.170840 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdbc0402-3c18-443d-9384-233509fa7ba9" containerName="authorino" Apr 16 08:48:27.171022 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.171009 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbc0402-3c18-443d-9384-233509fa7ba9" containerName="authorino" Apr 16 08:48:27.171150 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.171137 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" containerName="limitador" Apr 16 08:48:27.171223 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.171165 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" containerName="limitador" Apr 16 08:48:27.171322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.171302 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdbc0402-3c18-443d-9384-233509fa7ba9" containerName="authorino" Apr 16 08:48:27.171322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.171323 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b4bb2bf-7323-4577-b5d8-3fffd8192c2e" containerName="limitador" Apr 16 08:48:27.174511 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.174484 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.177447 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.177423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-zd29n\"" Apr 16 08:48:27.177578 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.177497 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 08:48:27.188911 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.188888 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-s7mth"] Apr 16 08:48:27.322553 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.322503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2525202-92cf-4022-a6e5-7d4572ce65fb-data\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.322777 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.322627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdz8\" (UniqueName: \"kubernetes.io/projected/b2525202-92cf-4022-a6e5-7d4572ce65fb-kube-api-access-fkdz8\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.423295 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.423207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2525202-92cf-4022-a6e5-7d4572ce65fb-data\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.423295 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.423281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdz8\" (UniqueName: \"kubernetes.io/projected/b2525202-92cf-4022-a6e5-7d4572ce65fb-kube-api-access-fkdz8\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.423624 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.423603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2525202-92cf-4022-a6e5-7d4572ce65fb-data\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.431814 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.431793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdz8\" (UniqueName: \"kubernetes.io/projected/b2525202-92cf-4022-a6e5-7d4572ce65fb-kube-api-access-fkdz8\") pod \"postgres-868db5846d-s7mth\" (UID: \"b2525202-92cf-4022-a6e5-7d4572ce65fb\") " pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.485766 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.485734 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:27.611928 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.611898 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-s7mth"] Apr 16 08:48:27.613906 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:27.613881 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2525202_92cf_4022_a6e5_7d4572ce65fb.slice/crio-f908ed0b251d5c16a25793fdc6d63ca84564b5bc5fec6f8461f95eeb9c8e346b WatchSource:0}: Error finding container f908ed0b251d5c16a25793fdc6d63ca84564b5bc5fec6f8461f95eeb9c8e346b: Status 404 returned error can't find the container with id f908ed0b251d5c16a25793fdc6d63ca84564b5bc5fec6f8461f95eeb9c8e346b Apr 16 08:48:27.640320 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:27.640290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-s7mth" event={"ID":"b2525202-92cf-4022-a6e5-7d4572ce65fb","Type":"ContainerStarted","Data":"f908ed0b251d5c16a25793fdc6d63ca84564b5bc5fec6f8461f95eeb9c8e346b"} Apr 16 08:48:33.662575 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:33.662538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-s7mth" event={"ID":"b2525202-92cf-4022-a6e5-7d4572ce65fb","Type":"ContainerStarted","Data":"9a98a37f6c018ca555e9e6c2f04589492aa0424aabe68b8cdd29b6b59f42ba08"} Apr 16 08:48:33.663048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:33.662638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:33.679568 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:33.679525 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-s7mth" podStartSLOduration=1.2076842509999999 podStartE2EDuration="6.679511657s" podCreationTimestamp="2026-04-16 08:48:27 +0000 UTC" firstStartedPulling="2026-04-16 08:48:27.615771445 +0000 UTC m=+559.535757298" lastFinishedPulling="2026-04-16 08:48:33.087598851 +0000 UTC m=+565.007584704" observedRunningTime="2026-04-16 08:48:33.677271359 +0000 UTC m=+565.597257271" watchObservedRunningTime="2026-04-16 08:48:33.679511657 +0000 UTC m=+565.599497532" Apr 16 08:48:39.695974 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:39.695943 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-s7mth" Apr 16 08:48:40.673098 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.673068 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:40.683665 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.683640 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.686198 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.686174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 08:48:40.686396 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.686377 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 08:48:40.686520 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.686208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rxq2t\"" Apr 16 08:48:40.688190 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.688166 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:40.697414 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.697389 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:48:40.701235 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.701216 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:40.704127 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.704106 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6hrhq\"" Apr 16 08:48:40.720512 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.720489 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:48:40.746642 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.746611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx49r\" (UniqueName: \"kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r\") pod \"maas-controller-f7689cdc8-t6v6g\" (UID: \"dfe384fe-5807-4ad3-971a-46211d6298fd\") " pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:40.848088 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.848049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx49r\" (UniqueName: \"kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r\") pod \"maas-controller-f7689cdc8-t6v6g\" (UID: \"dfe384fe-5807-4ad3-971a-46211d6298fd\") " pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:40.848272 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.848185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.848272 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.848229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxhw\" (UniqueName: \"kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.860252 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.860221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx49r\" (UniqueName: 
\"kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r\") pod \"maas-controller-f7689cdc8-t6v6g\" (UID: \"dfe384fe-5807-4ad3-971a-46211d6298fd\") " pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:40.949178 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.949101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.949178 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.949142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxhw\" (UniqueName: \"kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.951672 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.951641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.956668 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.956633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxhw\" (UniqueName: \"kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw\") pod \"maas-api-57dfbb646d-ztr94\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:40.999430 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:40.999396 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:41.011236 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.011205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:41.171789 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.171761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:48:41.173792 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:41.173763 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe384fe_5807_4ad3_971a_46211d6298fd.slice/crio-5506c52b88b9b74ab70309eaa9f50182bdf04d11ca9e25f92bf92c43bed21235 WatchSource:0}: Error finding container 5506c52b88b9b74ab70309eaa9f50182bdf04d11ca9e25f92bf92c43bed21235: Status 404 returned error can't find the container with id 5506c52b88b9b74ab70309eaa9f50182bdf04d11ca9e25f92bf92c43bed21235 Apr 16 08:48:41.345525 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.345442 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:41.347527 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:41.347484 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod033e236d_46d4_4e05_a295_7fce50e14911.slice/crio-f8b78e797c4955ba30e7d1cd4291b1c4bd60ca0d321e321814b2d3287418afb0 WatchSource:0}: Error finding container f8b78e797c4955ba30e7d1cd4291b1c4bd60ca0d321e321814b2d3287418afb0: Status 404 returned error can't find the container with id f8b78e797c4955ba30e7d1cd4291b1c4bd60ca0d321e321814b2d3287418afb0 Apr 16 08:48:41.474007 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.473970 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:41.478890 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.478869 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:41.486383 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.486360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:41.556167 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.556129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46rv\" (UniqueName: \"kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv\") pod \"authorino-8b475cf9f-7v9lc\" (UID: \"4f43281b-e287-44f0-a3fb-315352010f39\") " pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:41.657245 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.657144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z46rv\" (UniqueName: \"kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv\") pod \"authorino-8b475cf9f-7v9lc\" (UID: \"4f43281b-e287-44f0-a3fb-315352010f39\") " pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:41.664943 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.664915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46rv\" (UniqueName: \"kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv\") pod \"authorino-8b475cf9f-7v9lc\" (UID: \"4f43281b-e287-44f0-a3fb-315352010f39\") " pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:41.691669 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.691627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57dfbb646d-ztr94" event={"ID":"033e236d-46d4-4e05-a295-7fce50e14911","Type":"ContainerStarted","Data":"f8b78e797c4955ba30e7d1cd4291b1c4bd60ca0d321e321814b2d3287418afb0"} Apr 16 08:48:41.692631 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.692605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" event={"ID":"dfe384fe-5807-4ad3-971a-46211d6298fd","Type":"ContainerStarted","Data":"5506c52b88b9b74ab70309eaa9f50182bdf04d11ca9e25f92bf92c43bed21235"} Apr 16 08:48:41.722068 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.722035 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:41.722446 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.722319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:41.746454 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.746422 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:41.751330 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.751306 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.757114 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.757087 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:41.758744 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.758026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxmtw\" (UniqueName: \"kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.758744 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.758112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.859996 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.859580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxmtw\" (UniqueName: \"kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.860169 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.860084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.860912 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.860885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.868308 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.868282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxmtw\" (UniqueName: \"kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw\") pod \"authorino-7d79b7c9c5-5phw5\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.889738 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.889694 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:41.893792 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.893762 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:41.894088 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:41.894060 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f43281b_e287_44f0_a3fb_315352010f39.slice/crio-35442c68cb5a87ce2dfae3f0700e7d8b9962250157a38903a9a2a23986f4c70f WatchSource:0}: Error finding container 
35442c68cb5a87ce2dfae3f0700e7d8b9962250157a38903a9a2a23986f4c70f: Status 404 returned error can't find the container with id 35442c68cb5a87ce2dfae3f0700e7d8b9962250157a38903a9a2a23986f4c70f Apr 16 08:48:41.894187 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.894173 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:41.922050 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.921966 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56794f675f-mgfbt"] Apr 16 08:48:41.947786 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.947038 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56794f675f-mgfbt"] Apr 16 08:48:41.947786 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.947242 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:41.950552 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.950474 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 08:48:41.963595 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.963566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:41.963951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.963935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:41.964084 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.964070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w252z\" (UniqueName: \"kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:41.978132 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:41.978099 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56794f675f-mgfbt"] Apr 16 08:48:41.978501 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:48:41.978470 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-w252z openshift-service-ca tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56794f675f-mgfbt" podUID="79227e1c-0266-40e9-a84a-a83853969669" Apr 16 08:48:42.015735 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.002752 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:48:42.024038 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.022506 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.024038 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.022365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065362 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfskh\" (UniqueName: \"kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w252z\" (UniqueName: \"kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.065554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.066380 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.066334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.070321 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.070294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " 
pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.076312 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.076290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w252z\" (UniqueName: \"kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z\") pod \"authorino-56794f675f-mgfbt\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.158979 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.158919 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:42.163397 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:42.163351 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85bde3e0_0726_4ace_8cd7_5e14dc7389d6.slice/crio-c7bcce50b9827d375bcc58b289153d73fd6d2f0b1577b60eef6ab92724857537 WatchSource:0}: Error finding container c7bcce50b9827d375bcc58b289153d73fd6d2f0b1577b60eef6ab92724857537: Status 404 returned error can't find the container with id c7bcce50b9827d375bcc58b289153d73fd6d2f0b1577b60eef6ab92724857537 Apr 16 08:48:42.166480 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.166447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.166597 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.166495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfskh\" (UniqueName: \"kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.166597 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.166585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.167370 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.167346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.170326 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.170284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert\") pod \"authorino-5696479cd-b26hq\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.175779 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.175705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfskh\" (UniqueName: \"kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh\") pod \"authorino-5696479cd-b26hq\" (UID: 
\"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.346103 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.346068 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:48:42.591531 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.589630 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:48:42.620757 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:42.603024 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739baf7a_605e_49ed_9fe4_2d1d74b3c573.slice/crio-beb42e25fe96c6379628d2b1ee9e3ad2c4d8b89e17557c1233316cab40f064ef WatchSource:0}: Error finding container beb42e25fe96c6379628d2b1ee9e3ad2c4d8b89e17557c1233316cab40f064ef: Status 404 returned error can't find the container with id beb42e25fe96c6379628d2b1ee9e3ad2c4d8b89e17557c1233316cab40f064ef Apr 16 08:48:42.715740 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.701821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" event={"ID":"85bde3e0-0726-4ace-8cd7-5e14dc7389d6","Type":"ContainerStarted","Data":"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9"} Apr 16 08:48:42.715740 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.701867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" event={"ID":"85bde3e0-0726-4ace-8cd7-5e14dc7389d6","Type":"ContainerStarted","Data":"c7bcce50b9827d375bcc58b289153d73fd6d2f0b1577b60eef6ab92724857537"} Apr 16 08:48:42.719739 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.716900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5696479cd-b26hq" event={"ID":"739baf7a-605e-49ed-9fe4-2d1d74b3c573","Type":"ContainerStarted","Data":"beb42e25fe96c6379628d2b1ee9e3ad2c4d8b89e17557c1233316cab40f064ef"} Apr 16 08:48:42.727812 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.724762 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.727812 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.725254 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" podUID="4f43281b-e287-44f0-a3fb-315352010f39" containerName="authorino" containerID="cri-o://10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f" gracePeriod=30 Apr 16 08:48:42.727812 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.725539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" event={"ID":"4f43281b-e287-44f0-a3fb-315352010f39","Type":"ContainerStarted","Data":"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f"} Apr 16 08:48:42.727812 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.725572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" event={"ID":"4f43281b-e287-44f0-a3fb-315352010f39","Type":"ContainerStarted","Data":"35442c68cb5a87ce2dfae3f0700e7d8b9962250157a38903a9a2a23986f4c70f"} Apr 16 08:48:42.740905 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.740316 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:42.745408 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.745355 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" podStartSLOduration=1.283465605 podStartE2EDuration="1.745334208s" podCreationTimestamp="2026-04-16 08:48:41 +0000 UTC" firstStartedPulling="2026-04-16 08:48:41.895656528 +0000 UTC m=+573.815642387" lastFinishedPulling="2026-04-16 08:48:42.357525136 +0000 UTC m=+574.277510990" observedRunningTime="2026-04-16 08:48:42.743391586 +0000 UTC m=+574.663377462" watchObservedRunningTime="2026-04-16 08:48:42.745334208 +0000 UTC m=+574.665320087" Apr 16 08:48:42.774110 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.774071 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w252z\" (UniqueName: \"kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z\") pod \"79227e1c-0266-40e9-a84a-a83853969669\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " Apr 16 08:48:42.774271 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.774184 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert\") pod \"79227e1c-0266-40e9-a84a-a83853969669\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " Apr 16 08:48:42.774271 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.774215 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca\") pod \"79227e1c-0266-40e9-a84a-a83853969669\" (UID: \"79227e1c-0266-40e9-a84a-a83853969669\") " Apr 16 08:48:42.775348 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.775314 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca" (OuterVolumeSpecName: "openshift-service-ca") pod "79227e1c-0266-40e9-a84a-a83853969669" (UID: "79227e1c-0266-40e9-a84a-a83853969669"). InnerVolumeSpecName "openshift-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:48:42.780425 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.780393 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "79227e1c-0266-40e9-a84a-a83853969669" (UID: "79227e1c-0266-40e9-a84a-a83853969669"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:48:42.780798 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.780773 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z" (OuterVolumeSpecName: "kube-api-access-w252z") pod "79227e1c-0266-40e9-a84a-a83853969669" (UID: "79227e1c-0266-40e9-a84a-a83853969669"). InnerVolumeSpecName "kube-api-access-w252z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:42.879369 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.879329 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w252z\" (UniqueName: \"kubernetes.io/projected/79227e1c-0266-40e9-a84a-a83853969669-kube-api-access-w252z\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:42.879369 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.879371 2578 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/79227e1c-0266-40e9-a84a-a83853969669-tls-cert\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:42.879581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:42.879389 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79227e1c-0266-40e9-a84a-a83853969669-openshift-service-ca\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:43.093878 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.093582 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:43.182706 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.182527 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46rv\" (UniqueName: \"kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv\") pod \"4f43281b-e287-44f0-a3fb-315352010f39\" (UID: \"4f43281b-e287-44f0-a3fb-315352010f39\") " Apr 16 08:48:43.189438 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.189384 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv" (OuterVolumeSpecName: "kube-api-access-z46rv") pod "4f43281b-e287-44f0-a3fb-315352010f39" (UID: "4f43281b-e287-44f0-a3fb-315352010f39"). InnerVolumeSpecName "kube-api-access-z46rv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:43.284238 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.283847 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z46rv\" (UniqueName: \"kubernetes.io/projected/4f43281b-e287-44f0-a3fb-315352010f39-kube-api-access-z46rv\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:43.732798 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.732608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5696479cd-b26hq" event={"ID":"739baf7a-605e-49ed-9fe4-2d1d74b3c573","Type":"ContainerStarted","Data":"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569"} Apr 16 08:48:43.734670 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734485 2578 generic.go:358] "Generic (PLEG): container finished" podID="4f43281b-e287-44f0-a3fb-315352010f39" containerID="10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f" exitCode=0 Apr 16 08:48:43.734670 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734584 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56794f675f-mgfbt" Apr 16 08:48:43.734900 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734693 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" event={"ID":"4f43281b-e287-44f0-a3fb-315352010f39","Type":"ContainerDied","Data":"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f"} Apr 16 08:48:43.734900 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" event={"ID":"4f43281b-e287-44f0-a3fb-315352010f39","Type":"ContainerDied","Data":"35442c68cb5a87ce2dfae3f0700e7d8b9962250157a38903a9a2a23986f4c70f"} Apr 16 08:48:43.734900 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734777 2578 scope.go:117] "RemoveContainer" containerID="10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f" Apr 16 08:48:43.735075 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.734912 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7v9lc" Apr 16 08:48:43.735338 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.735310 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" podUID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" containerName="authorino" containerID="cri-o://583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9" gracePeriod=30 Apr 16 08:48:43.748296 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.748046 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5696479cd-b26hq" podStartSLOduration=2.414906471 podStartE2EDuration="2.74802776s" podCreationTimestamp="2026-04-16 08:48:41 +0000 UTC" firstStartedPulling="2026-04-16 08:48:42.605303185 +0000 UTC m=+574.525289046" lastFinishedPulling="2026-04-16 08:48:42.938424478 +0000 UTC m=+574.858410335" observedRunningTime="2026-04-16 08:48:43.746591368 +0000 UTC m=+575.666577285" watchObservedRunningTime="2026-04-16 08:48:43.74802776 +0000 UTC m=+575.668013617" Apr 16 08:48:43.769325 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.768982 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" podStartSLOduration=2.3785190529999998 podStartE2EDuration="2.768963308s" podCreationTimestamp="2026-04-16 08:48:41 +0000 UTC" firstStartedPulling="2026-04-16 08:48:42.164846609 +0000 UTC m=+574.084832466" lastFinishedPulling="2026-04-16 08:48:42.55529086 +0000 UTC m=+574.475276721" observedRunningTime="2026-04-16 08:48:43.767321167 +0000 UTC m=+575.687307044" watchObservedRunningTime="2026-04-16 08:48:43.768963308 +0000 UTC m=+575.688949185" Apr 16 08:48:43.773646 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.773616 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:43.774005 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.773981 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-kgfdr" podUID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" containerName="authorino" containerID="cri-o://e6d66404d5805e19cdffb13772775637bd2193014ca63701bcaf836927a687c9" gracePeriod=30 Apr 16 08:48:43.795015 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.794960 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:43.799623 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.799573 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7v9lc"] Apr 16 08:48:43.833905 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.833875 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56794f675f-mgfbt"] Apr 16 08:48:43.836694 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:43.836658 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56794f675f-mgfbt"] Apr 16 08:48:44.668628 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:44.667822 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f43281b-e287-44f0-a3fb-315352010f39" path="/var/lib/kubelet/pods/4f43281b-e287-44f0-a3fb-315352010f39/volumes" Apr 16 08:48:44.668628 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:44.668356 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79227e1c-0266-40e9-a84a-a83853969669" path="/var/lib/kubelet/pods/79227e1c-0266-40e9-a84a-a83853969669/volumes" Apr 16 08:48:44.739851 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:44.739818 2578 generic.go:358] "Generic (PLEG): container finished" podID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" containerID="e6d66404d5805e19cdffb13772775637bd2193014ca63701bcaf836927a687c9" exitCode=0 Apr 16 08:48:44.740313 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:44.739901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kgfdr" event={"ID":"d71bca0c-1ac4-4569-bf43-9737dcedb07e","Type":"ContainerDied","Data":"e6d66404d5805e19cdffb13772775637bd2193014ca63701bcaf836927a687c9"} Apr 16 08:48:45.149690 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.149497 2578 scope.go:117] "RemoveContainer" containerID="10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f" Apr 16 08:48:45.149814 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:48:45.149783 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f\": container with ID starting with 10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f not found: ID does not exist" containerID="10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f" Apr 16 08:48:45.149851 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.149811 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f"} err="failed to get container status \"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f\": rpc error: code = NotFound desc = could not find container \"10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f\": container with ID starting with 10fe42a1214b5b4b5a32fc73773d8ad1081d850ab15019c2c3f3598f420a7a3f not found: ID does not exist" Apr 16 08:48:45.331960 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.331936 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:45.335676 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.335654 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:45.407921 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.407842 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxmtw\" (UniqueName: \"kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw\") pod \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " Apr 16 08:48:45.408060 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.407926 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb94\" (UniqueName: \"kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94\") pod \"d71bca0c-1ac4-4569-bf43-9737dcedb07e\" (UID: \"d71bca0c-1ac4-4569-bf43-9737dcedb07e\") " Apr 16 08:48:45.408060 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.407955 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca\") pod \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\" (UID: \"85bde3e0-0726-4ace-8cd7-5e14dc7389d6\") " Apr 16 08:48:45.408373 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.408343 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca" (OuterVolumeSpecName: "openshift-service-ca") pod "85bde3e0-0726-4ace-8cd7-5e14dc7389d6" (UID: "85bde3e0-0726-4ace-8cd7-5e14dc7389d6"). InnerVolumeSpecName "openshift-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:48:45.410231 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.410203 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94" (OuterVolumeSpecName: "kube-api-access-zsb94") pod "d71bca0c-1ac4-4569-bf43-9737dcedb07e" (UID: "d71bca0c-1ac4-4569-bf43-9737dcedb07e"). InnerVolumeSpecName "kube-api-access-zsb94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:45.410347 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.410248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw" (OuterVolumeSpecName: "kube-api-access-dxmtw") pod "85bde3e0-0726-4ace-8cd7-5e14dc7389d6" (UID: "85bde3e0-0726-4ace-8cd7-5e14dc7389d6"). InnerVolumeSpecName "kube-api-access-dxmtw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:45.514617 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.513415 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsb94\" (UniqueName: \"kubernetes.io/projected/d71bca0c-1ac4-4569-bf43-9737dcedb07e-kube-api-access-zsb94\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:45.514617 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.513459 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-openshift-service-ca\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:45.514617 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.513482 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxmtw\" (UniqueName: \"kubernetes.io/projected/85bde3e0-0726-4ace-8cd7-5e14dc7389d6-kube-api-access-dxmtw\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:45.744937 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.744839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" event={"ID":"dfe384fe-5807-4ad3-971a-46211d6298fd","Type":"ContainerStarted","Data":"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae"} Apr 16 08:48:45.745385 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.744982 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:45.746923 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.746893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-kgfdr" event={"ID":"d71bca0c-1ac4-4569-bf43-9737dcedb07e","Type":"ContainerDied","Data":"d897a6e53863a34c43fa86f9f66192a45241802bb0ec39a38a736e20900c9ad1"} Apr 16 08:48:45.747052 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.746905 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-kgfdr" Apr 16 08:48:45.747052 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.746930 2578 scope.go:117] "RemoveContainer" containerID="e6d66404d5805e19cdffb13772775637bd2193014ca63701bcaf836927a687c9" Apr 16 08:48:45.748158 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.748123 2578 generic.go:358] "Generic (PLEG): container finished" podID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" containerID="583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9" exitCode=0 Apr 16 08:48:45.748291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.748151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" event={"ID":"85bde3e0-0726-4ace-8cd7-5e14dc7389d6","Type":"ContainerDied","Data":"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9"} Apr 16 08:48:45.748291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.748179 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" Apr 16 08:48:45.748291 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.748190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d79b7c9c5-5phw5" event={"ID":"85bde3e0-0726-4ace-8cd7-5e14dc7389d6","Type":"ContainerDied","Data":"c7bcce50b9827d375bcc58b289153d73fd6d2f0b1577b60eef6ab92724857537"} Apr 16 08:48:45.749817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.749790 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57dfbb646d-ztr94" event={"ID":"033e236d-46d4-4e05-a295-7fce50e14911","Type":"ContainerStarted","Data":"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb"} Apr 16 08:48:45.749953 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.749936 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:45.756979 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.756627 2578 scope.go:117] "RemoveContainer" containerID="583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9" Apr 16 08:48:45.762053 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.762014 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" podStartSLOduration=1.7708589 podStartE2EDuration="5.762002075s" podCreationTimestamp="2026-04-16 08:48:40 +0000 UTC" firstStartedPulling="2026-04-16 08:48:41.175035678 +0000 UTC m=+573.095021531" lastFinishedPulling="2026-04-16 08:48:45.166178848 +0000 UTC m=+577.086164706" observedRunningTime="2026-04-16 08:48:45.760499513 +0000 UTC m=+577.680485391" watchObservedRunningTime="2026-04-16 08:48:45.762002075 +0000 UTC m=+577.681987950" Apr 16 08:48:45.765501 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.765484 2578 scope.go:117] "RemoveContainer" containerID="583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9" Apr 16 08:48:45.765749 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:48:45.765730 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9\": container with ID starting with 583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9 not found: ID does not exist" containerID="583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9" Apr 16 08:48:45.765808 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.765756 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9"} err="failed to get container status \"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9\": rpc error: code = NotFound desc = could not find container \"583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9\": container with ID starting with 583aeaf94ed9b54daa78f97d3e303e274fe5c9e3178f727beaf7a9c9e0e08ba9 not found: ID does not exist" Apr 16 08:48:45.775785 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.775763 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:45.779272 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.779252 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-5phw5"] Apr 16 08:48:45.789758 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.789736 2578 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:45.793471 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.793451 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-kgfdr"] Apr 16 08:48:45.807513 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:45.807467 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-57dfbb646d-ztr94" podStartSLOduration=1.9903930920000001 podStartE2EDuration="5.807453172s" podCreationTimestamp="2026-04-16 08:48:40 +0000 UTC" firstStartedPulling="2026-04-16 08:48:41.349587018 +0000 UTC m=+573.269572871" lastFinishedPulling="2026-04-16 08:48:45.166647094 +0000 UTC m=+577.086632951" observedRunningTime="2026-04-16 08:48:45.805643658 +0000 UTC m=+577.725629547" watchObservedRunningTime="2026-04-16 08:48:45.807453172 +0000 UTC m=+577.727439048" Apr 16 08:48:46.667443 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:46.667408 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" path="/var/lib/kubelet/pods/85bde3e0-0726-4ace-8cd7-5e14dc7389d6/volumes" Apr 16 08:48:46.667783 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:46.667770 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" path="/var/lib/kubelet/pods/d71bca0c-1ac4-4569-bf43-9737dcedb07e/volumes" Apr 16 08:48:51.097045 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.097007 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:51.097590 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.097271 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-57dfbb646d-ztr94" podUID="033e236d-46d4-4e05-a295-7fce50e14911" containerName="maas-api" containerID="cri-o://202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb" gracePeriod=30 Apr 16 08:48:51.102066 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.102045 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:51.348009 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.347945 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:51.469956 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.469920 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls\") pod \"033e236d-46d4-4e05-a295-7fce50e14911\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " Apr 16 08:48:51.469956 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.469961 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxhw\" (UniqueName: \"kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw\") pod \"033e236d-46d4-4e05-a295-7fce50e14911\" (UID: \"033e236d-46d4-4e05-a295-7fce50e14911\") " Apr 16 08:48:51.472527 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.472476 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw" (OuterVolumeSpecName: "kube-api-access-5cxhw") pod "033e236d-46d4-4e05-a295-7fce50e14911" (UID: "033e236d-46d4-4e05-a295-7fce50e14911"). 
InnerVolumeSpecName "kube-api-access-5cxhw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:48:51.472527 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.472518 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "033e236d-46d4-4e05-a295-7fce50e14911" (UID: "033e236d-46d4-4e05-a295-7fce50e14911"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:48:51.571504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.571447 2578 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/033e236d-46d4-4e05-a295-7fce50e14911-maas-api-tls\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:51.571504 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.571497 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cxhw\" (UniqueName: \"kubernetes.io/projected/033e236d-46d4-4e05-a295-7fce50e14911-kube-api-access-5cxhw\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:48:51.772707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.772614 2578 generic.go:358] "Generic (PLEG): container finished" podID="033e236d-46d4-4e05-a295-7fce50e14911" containerID="202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb" exitCode=0 Apr 16 08:48:51.772707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.772680 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-57dfbb646d-ztr94" Apr 16 08:48:51.772707 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.772695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57dfbb646d-ztr94" event={"ID":"033e236d-46d4-4e05-a295-7fce50e14911","Type":"ContainerDied","Data":"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb"} Apr 16 08:48:51.772969 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.772747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57dfbb646d-ztr94" event={"ID":"033e236d-46d4-4e05-a295-7fce50e14911","Type":"ContainerDied","Data":"f8b78e797c4955ba30e7d1cd4291b1c4bd60ca0d321e321814b2d3287418afb0"} Apr 16 08:48:51.772969 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.772761 2578 scope.go:117] "RemoveContainer" containerID="202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb" Apr 16 08:48:51.781731 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.781691 2578 scope.go:117] "RemoveContainer" containerID="202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb" Apr 16 08:48:51.781994 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:48:51.781972 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb\": container with ID starting with 202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb not found: ID does not exist" containerID="202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb" Apr 16 08:48:51.782056 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.782001 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb"} err="failed to get container status \"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb\": rpc error: code = 
NotFound desc = could not find container \"202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb\": container with ID starting with 202bc3799c1e065b76205654e05d5ff3d5a0d33da1d7af893d293d28ff88a3cb not found: ID does not exist" Apr 16 08:48:51.794596 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.794576 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:51.797654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:51.797635 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-57dfbb646d-ztr94"] Apr 16 08:48:52.667270 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:52.667237 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033e236d-46d4-4e05-a295-7fce50e14911" path="/var/lib/kubelet/pods/033e236d-46d4-4e05-a295-7fce50e14911/volumes" Apr 16 08:48:56.760530 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:56.760501 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:48:57.058155 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.058055 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f79877486-ptbtd"] Apr 16 08:48:57.058983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.058954 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="033e236d-46d4-4e05-a295-7fce50e14911" containerName="maas-api" Apr 16 08:48:57.058983 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.058982 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="033e236d-46d4-4e05-a295-7fce50e14911" containerName="maas-api" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059033 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" containerName="authorino" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059049 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" containerName="authorino" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059069 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" containerName="authorino" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059077 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" containerName="authorino" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059094 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f43281b-e287-44f0-a3fb-315352010f39" containerName="authorino" Apr 16 08:48:57.059170 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059103 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f43281b-e287-44f0-a3fb-315352010f39" containerName="authorino" Apr 16 08:48:57.059422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059258 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f43281b-e287-44f0-a3fb-315352010f39" containerName="authorino" Apr 16 08:48:57.059422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059277 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="85bde3e0-0726-4ace-8cd7-5e14dc7389d6" containerName="authorino" Apr 16 08:48:57.059422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059295 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d71bca0c-1ac4-4569-bf43-9737dcedb07e" containerName="authorino" Apr 16 08:48:57.059422 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.059304 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="033e236d-46d4-4e05-a295-7fce50e14911" containerName="maas-api" Apr 16 08:48:57.062979 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.062951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:57.067058 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.067028 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f79877486-ptbtd"] Apr 16 08:48:57.123386 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.123355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7pj\" (UniqueName: \"kubernetes.io/projected/8bb71c58-a16f-4e7c-9bff-3111f3c94d48-kube-api-access-6z7pj\") pod \"maas-controller-7f79877486-ptbtd\" (UID: \"8bb71c58-a16f-4e7c-9bff-3111f3c94d48\") " pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:57.224112 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.224077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7pj\" (UniqueName: \"kubernetes.io/projected/8bb71c58-a16f-4e7c-9bff-3111f3c94d48-kube-api-access-6z7pj\") pod \"maas-controller-7f79877486-ptbtd\" (UID: \"8bb71c58-a16f-4e7c-9bff-3111f3c94d48\") " pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:57.232196 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.232168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7pj\" (UniqueName: \"kubernetes.io/projected/8bb71c58-a16f-4e7c-9bff-3111f3c94d48-kube-api-access-6z7pj\") pod \"maas-controller-7f79877486-ptbtd\" (UID: \"8bb71c58-a16f-4e7c-9bff-3111f3c94d48\") " pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:57.374935 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.374903 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:57.528184 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.528156 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f79877486-ptbtd"] Apr 16 08:48:57.529855 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:48:57.529826 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb71c58_a16f_4e7c_9bff_3111f3c94d48.slice/crio-758b64d72b1e40a9c5584ecafceda9af4006d4920a5bb713e68346746ed2e08a WatchSource:0}: Error finding container 758b64d72b1e40a9c5584ecafceda9af4006d4920a5bb713e68346746ed2e08a: Status 404 returned error can't find the container with id 758b64d72b1e40a9c5584ecafceda9af4006d4920a5bb713e68346746ed2e08a Apr 16 08:48:57.795367 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:57.795284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f79877486-ptbtd" event={"ID":"8bb71c58-a16f-4e7c-9bff-3111f3c94d48","Type":"ContainerStarted","Data":"758b64d72b1e40a9c5584ecafceda9af4006d4920a5bb713e68346746ed2e08a"} Apr 16 08:48:58.800378 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:58.800344 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f79877486-ptbtd" event={"ID":"8bb71c58-a16f-4e7c-9bff-3111f3c94d48","Type":"ContainerStarted","Data":"2c509c0c90342691b586aec1fda22529207c500a9ec72e876621ea9b74e9daf9"} Apr 16 08:48:58.800791 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:58.800403 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:48:58.816091 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:48:58.816036 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f79877486-ptbtd" podStartSLOduration=1.432267217 podStartE2EDuration="1.81602221s" podCreationTimestamp="2026-04-16 08:48:57 +0000 UTC" firstStartedPulling="2026-04-16 08:48:57.531173092 +0000 UTC m=+589.451158945" lastFinishedPulling="2026-04-16 08:48:57.914928079 +0000 UTC m=+589.834913938" observedRunningTime="2026-04-16 08:48:58.813606225 +0000 UTC m=+590.733592099" watchObservedRunningTime="2026-04-16 08:48:58.81602221 +0000 UTC m=+590.736008085" Apr 16 08:49:08.612296 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:08.612262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:49:08.612699 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:08.612401 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:49:08.617313 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:08.617293 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:49:08.617431 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:08.617365 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:49:09.809878 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:09.809844 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/maas-controller-7f79877486-ptbtd" Apr 16 08:49:09.845858 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:09.845829 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:49:09.846127 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:09.846085 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" podUID="dfe384fe-5807-4ad3-971a-46211d6298fd" containerName="manager" containerID="cri-o://35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae" gracePeriod=10 Apr 16 08:49:10.086861 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.086834 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:49:10.244438 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.244399 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx49r\" (UniqueName: \"kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r\") pod \"dfe384fe-5807-4ad3-971a-46211d6298fd\" (UID: \"dfe384fe-5807-4ad3-971a-46211d6298fd\") " Apr 16 08:49:10.246598 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.246574 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r" (OuterVolumeSpecName: "kube-api-access-qx49r") pod "dfe384fe-5807-4ad3-971a-46211d6298fd" (UID: "dfe384fe-5807-4ad3-971a-46211d6298fd"). InnerVolumeSpecName "kube-api-access-qx49r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:49:10.345822 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.345787 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qx49r\" (UniqueName: \"kubernetes.io/projected/dfe384fe-5807-4ad3-971a-46211d6298fd-kube-api-access-qx49r\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:49:10.842618 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.842581 2578 generic.go:358] "Generic (PLEG): container finished" podID="dfe384fe-5807-4ad3-971a-46211d6298fd" containerID="35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae" exitCode=0 Apr 16 08:49:10.843166 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.842639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" event={"ID":"dfe384fe-5807-4ad3-971a-46211d6298fd","Type":"ContainerDied","Data":"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae"} Apr 16 08:49:10.843166 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.842647 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" Apr 16 08:49:10.843166 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.842673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7689cdc8-t6v6g" event={"ID":"dfe384fe-5807-4ad3-971a-46211d6298fd","Type":"ContainerDied","Data":"5506c52b88b9b74ab70309eaa9f50182bdf04d11ca9e25f92bf92c43bed21235"} Apr 16 08:49:10.843166 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.842694 2578 scope.go:117] "RemoveContainer" containerID="35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae" Apr 16 08:49:10.850862 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.850682 2578 scope.go:117] "RemoveContainer" containerID="35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae" Apr 16 08:49:10.850992 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:49:10.850970 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae\": container with ID starting with 35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae not found: ID does not exist" containerID="35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae" Apr 16 08:49:10.851040 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.851002 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae"} err="failed to get container status \"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae\": rpc error: code = NotFound desc = could not find container \"35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae\": container with ID starting with 35951679933dcafde6a03e2b7d7e4403a976cab2a062e3a6291216ccb49af8ae not found: ID does not exist" Apr 16 08:49:10.858778 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.858756 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:49:10.868461 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:10.868439 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f7689cdc8-t6v6g"] Apr 16 08:49:12.666387 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:12.666352 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe384fe-5807-4ad3-971a-46211d6298fd" path="/var/lib/kubelet/pods/dfe384fe-5807-4ad3-971a-46211d6298fd/volumes" Apr 16 08:49:19.698602 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.698568 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv"] Apr 16 08:49:19.699048 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.699030 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfe384fe-5807-4ad3-971a-46211d6298fd" containerName="manager" Apr 16 08:49:19.699089 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.699052 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe384fe-5807-4ad3-971a-46211d6298fd" containerName="manager" Apr 16 08:49:19.699134 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.699125 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfe384fe-5807-4ad3-971a-46211d6298fd" containerName="manager" Apr 16 08:49:19.706597 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.706578 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.711086 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.711060 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-88shx\"" Apr 16 08:49:19.711242 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.711088 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 08:49:19.711242 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.711060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 08:49:19.711453 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.711436 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 08:49:19.712365 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.712342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv"] Apr 16 08:49:19.834356 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.834530 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rf88\" (UniqueName: \"kubernetes.io/projected/9bd69522-106a-4245-ba69-ec8d51af6dca-kube-api-access-4rf88\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.834530 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.834530 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd69522-106a-4245-ba69-ec8d51af6dca-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.834633 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.834633 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.834576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935517 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rf88\" (UniqueName: \"kubernetes.io/projected/9bd69522-106a-4245-ba69-ec8d51af6dca-kube-api-access-4rf88\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd69522-106a-4245-ba69-ec8d51af6dca-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.935686 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.936020 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.935939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.936063 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.936014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.936063 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.936054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.938146 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.938123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9bd69522-106a-4245-ba69-ec8d51af6dca-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.938318 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.938301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd69522-106a-4245-ba69-ec8d51af6dca-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:19.943163 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:19.943144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rf88\" (UniqueName: \"kubernetes.io/projected/9bd69522-106a-4245-ba69-ec8d51af6dca-kube-api-access-4rf88\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv\" (UID: \"9bd69522-106a-4245-ba69-ec8d51af6dca\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:20.017558 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.017472 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:20.148636 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.148600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv"] Apr 16 08:49:20.150984 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:49:20.150943 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd69522_106a_4245_ba69_ec8d51af6dca.slice/crio-342454c1ddb2afa399bffcbad3a8b15990e49adaa7b48f8d0fd53df529ab385a WatchSource:0}: Error finding container 342454c1ddb2afa399bffcbad3a8b15990e49adaa7b48f8d0fd53df529ab385a: Status 404 returned error can't find the container with id 342454c1ddb2afa399bffcbad3a8b15990e49adaa7b48f8d0fd53df529ab385a Apr 16 08:49:20.303091 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.302986 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7"] Apr 16 08:49:20.307919 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.307897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.311397 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.311378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 08:49:20.324200 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.324174 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7"] Apr 16 08:49:20.341654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.341787 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d130a90-dcb1-40f8-b852-3d1d35375afd-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.341787 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dvq\" (UniqueName: \"kubernetes.io/projected/8d130a90-dcb1-40f8-b852-3d1d35375afd-kube-api-access-n5dvq\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.341854 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.341854 
ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.341922 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.341892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.442854 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.442822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d130a90-dcb1-40f8-b852-3d1d35375afd-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443015 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.442894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dvq\" (UniqueName: \"kubernetes.io/projected/8d130a90-dcb1-40f8-b852-3d1d35375afd-kube-api-access-n5dvq\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443015 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.442917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443015 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.442940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443172 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.443016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443172 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.443058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " 
pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443454 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.443410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443570 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.443462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.443570 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.443457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.445270 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.445249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d130a90-dcb1-40f8-b852-3d1d35375afd-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.445468 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.445452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d130a90-dcb1-40f8-b852-3d1d35375afd-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.455019 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.454990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dvq\" (UniqueName: \"kubernetes.io/projected/8d130a90-dcb1-40f8-b852-3d1d35375afd-kube-api-access-n5dvq\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7\" (UID: \"8d130a90-dcb1-40f8-b852-3d1d35375afd\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.618649 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.618607 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:20.753654 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.753625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7"] Apr 16 08:49:20.755892 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:49:20.755839 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d130a90_dcb1_40f8_b852_3d1d35375afd.slice/crio-16276a0c82c9bc6a669ca3da874dde31aa8fe5b1b820de7caed4fae475a681b7 WatchSource:0}: Error finding container 16276a0c82c9bc6a669ca3da874dde31aa8fe5b1b820de7caed4fae475a681b7: Status 404 returned error can't find the container with id 16276a0c82c9bc6a669ca3da874dde31aa8fe5b1b820de7caed4fae475a681b7 Apr 16 08:49:20.880735 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.880622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" event={"ID":"9bd69522-106a-4245-ba69-ec8d51af6dca","Type":"ContainerStarted","Data":"342454c1ddb2afa399bffcbad3a8b15990e49adaa7b48f8d0fd53df529ab385a"} Apr 16 08:49:20.881994 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:20.881953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" event={"ID":"8d130a90-dcb1-40f8-b852-3d1d35375afd","Type":"ContainerStarted","Data":"16276a0c82c9bc6a669ca3da874dde31aa8fe5b1b820de7caed4fae475a681b7"} Apr 16 08:49:21.994848 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:21.994811 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd"] Apr 16 08:49:21.998963 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:21.998935 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.001605 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.001575 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 08:49:22.009895 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.009736 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd"] Apr 16 08:49:22.056961 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.056835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr66j\" (UniqueName: \"kubernetes.io/projected/9f329d91-e0c4-4987-8788-700da7886b55-kube-api-access-gr66j\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.056961 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.056922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.057322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.056989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f329d91-e0c4-4987-8788-700da7886b55-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.057322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.057036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.057322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.057065 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.057322 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.057113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr66j\" (UniqueName: \"kubernetes.io/projected/9f329d91-e0c4-4987-8788-700da7886b55-kube-api-access-gr66j\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: 
\"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f329d91-e0c4-4987-8788-700da7886b55-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.158951 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.158591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.159482 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.159289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.159482 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.159445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.159600 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.159486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.162175 
ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.162149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f329d91-e0c4-4987-8788-700da7886b55-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.162498 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.162471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f329d91-e0c4-4987-8788-700da7886b55-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.166463 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.166438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr66j\" (UniqueName: \"kubernetes.io/projected/9f329d91-e0c4-4987-8788-700da7886b55-kube-api-access-gr66j\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hxdnd\" (UID: \"9f329d91-e0c4-4987-8788-700da7886b55\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.316266 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.316107 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:22.481701 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.481662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd"] Apr 16 08:49:22.485852 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:49:22.485823 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f329d91_e0c4_4987_8788_700da7886b55.slice/crio-9aa5858a059aaddf9a34e6773c36b8a8b0873a4cdb07cf7ad39ba8d4a9729333 WatchSource:0}: Error finding container 9aa5858a059aaddf9a34e6773c36b8a8b0873a4cdb07cf7ad39ba8d4a9729333: Status 404 returned error can't find the container with id 9aa5858a059aaddf9a34e6773c36b8a8b0873a4cdb07cf7ad39ba8d4a9729333 Apr 16 08:49:22.892122 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:22.892072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" event={"ID":"9f329d91-e0c4-4987-8788-700da7886b55","Type":"ContainerStarted","Data":"9aa5858a059aaddf9a34e6773c36b8a8b0873a4cdb07cf7ad39ba8d4a9729333"} Apr 16 08:49:26.916334 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:26.916290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" event={"ID":"9f329d91-e0c4-4987-8788-700da7886b55","Type":"ContainerStarted","Data":"058a7194fc92b06131f398a54595d39d8c21d954cd68def276ac8e897feb016d"} Apr 16 08:49:26.918237 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:26.918203 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" event={"ID":"9bd69522-106a-4245-ba69-ec8d51af6dca","Type":"ContainerStarted","Data":"c5d6265d633b98140b64218a382e054a44030fc804811946dee0a74ad1c279fb"} Apr 16 08:49:26.919893 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:26.919857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" 
event={"ID":"8d130a90-dcb1-40f8-b852-3d1d35375afd","Type":"ContainerStarted","Data":"e16c726e7a33e828b96bb498e24ad710a70b491eeb212468b0876f860377f421"} Apr 16 08:49:31.941561 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:31.941520 2578 generic.go:358] "Generic (PLEG): container finished" podID="8d130a90-dcb1-40f8-b852-3d1d35375afd" containerID="e16c726e7a33e828b96bb498e24ad710a70b491eeb212468b0876f860377f421" exitCode=0 Apr 16 08:49:31.941978 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:31.941607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" event={"ID":"8d130a90-dcb1-40f8-b852-3d1d35375afd","Type":"ContainerDied","Data":"e16c726e7a33e828b96bb498e24ad710a70b491eeb212468b0876f860377f421"} Apr 16 08:49:32.947737 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:32.947642 2578 generic.go:358] "Generic (PLEG): container finished" podID="9f329d91-e0c4-4987-8788-700da7886b55" containerID="058a7194fc92b06131f398a54595d39d8c21d954cd68def276ac8e897feb016d" exitCode=0 Apr 16 08:49:32.948186 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:32.947738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" event={"ID":"9f329d91-e0c4-4987-8788-700da7886b55","Type":"ContainerDied","Data":"058a7194fc92b06131f398a54595d39d8c21d954cd68def276ac8e897feb016d"} Apr 16 08:49:32.951080 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:32.951049 2578 generic.go:358] "Generic (PLEG): container finished" podID="9bd69522-106a-4245-ba69-ec8d51af6dca" containerID="c5d6265d633b98140b64218a382e054a44030fc804811946dee0a74ad1c279fb" exitCode=0 Apr 16 08:49:32.951190 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:32.951127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" event={"ID":"9bd69522-106a-4245-ba69-ec8d51af6dca","Type":"ContainerDied","Data":"c5d6265d633b98140b64218a382e054a44030fc804811946dee0a74ad1c279fb"} Apr 16 08:49:36.970073 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.970040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" event={"ID":"9bd69522-106a-4245-ba69-ec8d51af6dca","Type":"ContainerStarted","Data":"e6106808fd9b09744d18a63cc9cc65238a2059c5e9e9f17e81a500acad32f694"} Apr 16 08:49:36.970511 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.970274 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:36.971803 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.971768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" event={"ID":"8d130a90-dcb1-40f8-b852-3d1d35375afd","Type":"ContainerStarted","Data":"6555502605b259860658df52dfdd6c75de9b2833bb60b473e49cdc1d1b4a0c53"} Apr 16 08:49:36.972013 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.971996 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:49:36.973321 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.973287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" event={"ID":"9f329d91-e0c4-4987-8788-700da7886b55","Type":"ContainerStarted","Data":"9351122b0e9cc865c9c4f1c301ec46fbc01b94944f7b55f589861a813fe00308"} Apr 16 08:49:36.973510 
ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.973492 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:36.987680 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:36.987634 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" podStartSLOduration=1.692504732 podStartE2EDuration="17.987621974s" podCreationTimestamp="2026-04-16 08:49:19 +0000 UTC" firstStartedPulling="2026-04-16 08:49:20.153181374 +0000 UTC m=+612.073167229" lastFinishedPulling="2026-04-16 08:49:36.448298616 +0000 UTC m=+628.368284471" observedRunningTime="2026-04-16 08:49:36.986467403 +0000 UTC m=+628.906453288" watchObservedRunningTime="2026-04-16 08:49:36.987621974 +0000 UTC m=+628.907607849" Apr 16 08:49:37.003903 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:37.003847 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" podStartSLOduration=1.396045334 podStartE2EDuration="17.00382905s" podCreationTimestamp="2026-04-16 08:49:20 +0000 UTC" firstStartedPulling="2026-04-16 08:49:20.758417886 +0000 UTC m=+612.678403751" lastFinishedPulling="2026-04-16 08:49:36.366201613 +0000 UTC m=+628.286187467" observedRunningTime="2026-04-16 08:49:37.001847679 +0000 UTC m=+628.921833566" watchObservedRunningTime="2026-04-16 08:49:37.00382905 +0000 UTC m=+628.923814926" Apr 16 08:49:37.019224 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:37.019176 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" podStartSLOduration=2.133184254 podStartE2EDuration="16.019163438s" podCreationTimestamp="2026-04-16 08:49:21 +0000 UTC" firstStartedPulling="2026-04-16 08:49:22.487266735 +0000 UTC m=+614.407252593" lastFinishedPulling="2026-04-16 08:49:36.373245917 +0000 UTC m=+628.293231777" observedRunningTime="2026-04-16 08:49:37.017138499 +0000 UTC m=+628.937124403" watchObservedRunningTime="2026-04-16 08:49:37.019163438 +0000 UTC m=+628.939149315" Apr 16 08:49:47.990757 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:47.990726 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hxdnd" Apr 16 08:49:47.991692 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:47.991668 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv" Apr 16 08:49:47.991853 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:49:47.991753 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7" Apr 16 08:50:37.915875 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:37.915837 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5c66b85975-4phh9"] Apr 16 08:50:37.919575 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:37.919557 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:37.925683 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:37.925652 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5c66b85975-4phh9"] Apr 16 08:50:38.026383 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.026347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxg5r\" (UniqueName: \"kubernetes.io/projected/758dc381-cecc-435a-9fb1-33ed17fa8c4b-kube-api-access-qxg5r\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.026548 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.026404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/758dc381-cecc-435a-9fb1-33ed17fa8c4b-openshift-service-ca\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.026548 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.026460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/758dc381-cecc-435a-9fb1-33ed17fa8c4b-tls-cert\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.127258 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.127220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/758dc381-cecc-435a-9fb1-33ed17fa8c4b-tls-cert\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.127421 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.127315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxg5r\" (UniqueName: \"kubernetes.io/projected/758dc381-cecc-435a-9fb1-33ed17fa8c4b-kube-api-access-qxg5r\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.127421 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.127350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/758dc381-cecc-435a-9fb1-33ed17fa8c4b-openshift-service-ca\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.128019 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.127986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/758dc381-cecc-435a-9fb1-33ed17fa8c4b-openshift-service-ca\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.129975 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.129951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/758dc381-cecc-435a-9fb1-33ed17fa8c4b-tls-cert\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " 
pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.137106 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.137080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxg5r\" (UniqueName: \"kubernetes.io/projected/758dc381-cecc-435a-9fb1-33ed17fa8c4b-kube-api-access-qxg5r\") pod \"authorino-5c66b85975-4phh9\" (UID: \"758dc381-cecc-435a-9fb1-33ed17fa8c4b\") " pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.230581 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.230497 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5c66b85975-4phh9" Apr 16 08:50:38.359731 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.359692 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5c66b85975-4phh9"] Apr 16 08:50:38.361583 ip-10-0-139-8 kubenswrapper[2578]: W0416 08:50:38.361554 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod758dc381_cecc_435a_9fb1_33ed17fa8c4b.slice/crio-3a9b25d1a94a5e6866a3033f60df34dee6d763b086573bd622b5af8f66e2d614 WatchSource:0}: Error finding container 3a9b25d1a94a5e6866a3033f60df34dee6d763b086573bd622b5af8f66e2d614: Status 404 returned error can't find the container with id 3a9b25d1a94a5e6866a3033f60df34dee6d763b086573bd622b5af8f66e2d614 Apr 16 08:50:38.362956 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:38.362935 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:50:39.211375 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.211329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5c66b85975-4phh9" event={"ID":"758dc381-cecc-435a-9fb1-33ed17fa8c4b","Type":"ContainerStarted","Data":"5cd38ae20b796a3b6f231ba7737cf46abbe71e44d0c524ec31c252688bb13a03"} Apr 16 08:50:39.211375 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.211382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5c66b85975-4phh9" event={"ID":"758dc381-cecc-435a-9fb1-33ed17fa8c4b","Type":"ContainerStarted","Data":"3a9b25d1a94a5e6866a3033f60df34dee6d763b086573bd622b5af8f66e2d614"} Apr 16 08:50:39.230779 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.230696 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5c66b85975-4phh9" podStartSLOduration=1.819311637 podStartE2EDuration="2.230680742s" podCreationTimestamp="2026-04-16 08:50:37 +0000 UTC" firstStartedPulling="2026-04-16 08:50:38.363107824 +0000 UTC m=+690.283093680" lastFinishedPulling="2026-04-16 08:50:38.774476932 +0000 UTC m=+690.694462785" observedRunningTime="2026-04-16 08:50:39.229460251 +0000 UTC m=+691.149446126" watchObservedRunningTime="2026-04-16 08:50:39.230680742 +0000 UTC m=+691.150666652" Apr 16 08:50:39.268693 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.268654 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:50:39.269273 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.269219 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5696479cd-b26hq" podUID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" containerName="authorino" containerID="cri-o://eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569" gracePeriod=30 Apr 16 08:50:39.522918 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.522895 2578 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:50:39.540461 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.540437 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca\") pod \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " Apr 16 08:50:39.540633 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.540481 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfskh\" (UniqueName: \"kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh\") pod \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " Apr 16 08:50:39.540633 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.540515 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert\") pod \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\" (UID: \"739baf7a-605e-49ed-9fe4-2d1d74b3c573\") " Apr 16 08:50:39.540869 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.540839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca" (OuterVolumeSpecName: "openshift-service-ca") pod "739baf7a-605e-49ed-9fe4-2d1d74b3c573" (UID: "739baf7a-605e-49ed-9fe4-2d1d74b3c573"). InnerVolumeSpecName "openshift-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:50:39.543164 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.543137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh" (OuterVolumeSpecName: "kube-api-access-kfskh") pod "739baf7a-605e-49ed-9fe4-2d1d74b3c573" (UID: "739baf7a-605e-49ed-9fe4-2d1d74b3c573"). InnerVolumeSpecName "kube-api-access-kfskh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:50:39.557047 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.557009 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "739baf7a-605e-49ed-9fe4-2d1d74b3c573" (UID: "739baf7a-605e-49ed-9fe4-2d1d74b3c573"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:50:39.641573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.641538 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/739baf7a-605e-49ed-9fe4-2d1d74b3c573-openshift-service-ca\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:50:39.641573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.641568 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfskh\" (UniqueName: \"kubernetes.io/projected/739baf7a-605e-49ed-9fe4-2d1d74b3c573-kube-api-access-kfskh\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:50:39.641573 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:39.641578 2578 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/739baf7a-605e-49ed-9fe4-2d1d74b3c573-tls-cert\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 08:50:40.215755 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.215689 2578 generic.go:358] "Generic (PLEG): container finished" podID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" containerID="eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569" exitCode=0 Apr 16 08:50:40.216233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.215779 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5696479cd-b26hq" Apr 16 08:50:40.216233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.215777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5696479cd-b26hq" event={"ID":"739baf7a-605e-49ed-9fe4-2d1d74b3c573","Type":"ContainerDied","Data":"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569"} Apr 16 08:50:40.216233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.215910 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5696479cd-b26hq" event={"ID":"739baf7a-605e-49ed-9fe4-2d1d74b3c573","Type":"ContainerDied","Data":"beb42e25fe96c6379628d2b1ee9e3ad2c4d8b89e17557c1233316cab40f064ef"} Apr 16 08:50:40.216233 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.215931 2578 scope.go:117] "RemoveContainer" containerID="eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569" Apr 16 08:50:40.224817 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.224801 2578 scope.go:117] "RemoveContainer" containerID="eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569" Apr 16 08:50:40.225067 ip-10-0-139-8 kubenswrapper[2578]: E0416 08:50:40.225044 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569\": container with ID starting with eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569 not found: ID does not exist" containerID="eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569" Apr 16 08:50:40.225163 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.225073 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569"} err="failed to get container status \"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569\": rpc error: code = NotFound desc = could not find container \"eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569\": container with ID starting with 
eb779e98a1f2cc51b195a631d4f744eefbe168537fe0f74adf4888c762889569 not found: ID does not exist" Apr 16 08:50:40.236973 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.236948 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:50:40.240788 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.240765 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5696479cd-b26hq"] Apr 16 08:50:40.666755 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:50:40.666701 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" path="/var/lib/kubelet/pods/739baf7a-605e-49ed-9fe4-2d1d74b3c573/volumes" Apr 16 08:54:08.643926 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:54:08.643849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:54:08.643926 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:54:08.643893 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:54:08.649427 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:54:08.649406 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:54:08.649569 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:54:08.649411 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:59:08.672666 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:59:08.672635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:59:08.674420 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:59:08.674398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 08:59:08.677372 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:59:08.677354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 08:59:08.679258 ip-10-0-139-8 kubenswrapper[2578]: I0416 08:59:08.679236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:00:00.137888 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.137809 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:00:00.138383 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.138199 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" containerName="authorino" Apr 16 09:00:00.138383 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.138210 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" containerName="authorino" Apr 16 09:00:00.138383 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.138286 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="739baf7a-605e-49ed-9fe4-2d1d74b3c573" containerName="authorino" Apr 16 09:00:00.141358 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.141334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:00:00.143786 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.143763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rxq2t\"" Apr 16 09:00:00.154486 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.154463 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:00:00.243249 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.243206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcxs\" (UniqueName: \"kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs\") pod \"maas-api-key-cleanup-29605500-t7nwn\" (UID: \"8e6ecc42-0c29-414e-9a72-75b25fd08298\") " pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:00:00.344861 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.344821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbcxs\" (UniqueName: \"kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs\") pod \"maas-api-key-cleanup-29605500-t7nwn\" (UID: \"8e6ecc42-0c29-414e-9a72-75b25fd08298\") " pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:00:00.353610 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.353577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbcxs\" (UniqueName: \"kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs\") pod \"maas-api-key-cleanup-29605500-t7nwn\" (UID: \"8e6ecc42-0c29-414e-9a72-75b25fd08298\") " pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:00:00.451764 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.451644 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:00:00.581275 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.581205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:00:00.584224 ip-10-0-139-8 kubenswrapper[2578]: W0416 09:00:00.584193 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6ecc42_0c29_414e_9a72_75b25fd08298.slice/crio-b4d7088f452321b1556c6d8c17d8125bc76880533e6b92f5b3b4df130caf41e9 WatchSource:0}: Error finding container b4d7088f452321b1556c6d8c17d8125bc76880533e6b92f5b3b4df130caf41e9: Status 404 returned error can't find the container with id b4d7088f452321b1556c6d8c17d8125bc76880533e6b92f5b3b4df130caf41e9 Apr 16 09:00:00.585983 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:00.585960 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 09:00:01.265596 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:01.265554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerStarted","Data":"b4d7088f452321b1556c6d8c17d8125bc76880533e6b92f5b3b4df130caf41e9"} Apr 16 09:00:28.366896 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:28.366841 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerStarted","Data":"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53"} Apr 16 09:00:28.383410 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:28.383359 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" podStartSLOduration=1.752040075 podStartE2EDuration="28.38334499s" podCreationTimestamp="2026-04-16 09:00:00 +0000 UTC" firstStartedPulling="2026-04-16 09:00:00.586093392 +0000 UTC m=+1252.506079247" lastFinishedPulling="2026-04-16 09:00:27.217398308 +0000 UTC m=+1279.137384162" observedRunningTime="2026-04-16 09:00:28.38074376 +0000 UTC m=+1280.300729627" watchObservedRunningTime="2026-04-16 09:00:28.38334499 +0000 UTC m=+1280.303330864" Apr 16 09:00:48.441674 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:48.441638 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerID="e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53" exitCode=6 Apr 16 09:00:48.442099 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:48.441693 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerDied","Data":"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53"} Apr 16 09:00:48.442099 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:48.442052 2578 scope.go:117] "RemoveContainer" containerID="e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53" Apr 16 09:00:49.446885 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:00:49.446850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerStarted","Data":"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d"} Apr 16 09:01:00.010960 ip-10-0-139-8 kubenswrapper[2578]: I0416 
09:01:00.010927 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:01:00.011428 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:00.011197 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" podUID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerName="cleanup" containerID="cri-o://e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d" gracePeriod=30 Apr 16 09:01:09.158415 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.158390 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:01:09.321208 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.321121 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbcxs\" (UniqueName: \"kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs\") pod \"8e6ecc42-0c29-414e-9a72-75b25fd08298\" (UID: \"8e6ecc42-0c29-414e-9a72-75b25fd08298\") " Apr 16 09:01:09.323356 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.323323 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs" (OuterVolumeSpecName: "kube-api-access-nbcxs") pod "8e6ecc42-0c29-414e-9a72-75b25fd08298" (UID: "8e6ecc42-0c29-414e-9a72-75b25fd08298"). InnerVolumeSpecName "kube-api-access-nbcxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 09:01:09.422339 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.422305 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbcxs\" (UniqueName: \"kubernetes.io/projected/8e6ecc42-0c29-414e-9a72-75b25fd08298-kube-api-access-nbcxs\") on node \"ip-10-0-139-8.ec2.internal\" DevicePath \"\"" Apr 16 09:01:09.519743 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.519681 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerID="e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d" exitCode=6 Apr 16 09:01:09.519896 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.519758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerDied","Data":"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d"} Apr 16 09:01:09.519896 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.519775 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" Apr 16 09:01:09.519896 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.519801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605500-t7nwn" event={"ID":"8e6ecc42-0c29-414e-9a72-75b25fd08298","Type":"ContainerDied","Data":"b4d7088f452321b1556c6d8c17d8125bc76880533e6b92f5b3b4df130caf41e9"} Apr 16 09:01:09.519896 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.519817 2578 scope.go:117] "RemoveContainer" containerID="e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d" Apr 16 09:01:09.528266 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.528244 2578 scope.go:117] "RemoveContainer" containerID="e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53" Apr 16 09:01:09.536691 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.536673 2578 scope.go:117] "RemoveContainer" containerID="e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d" Apr 16 09:01:09.536974 ip-10-0-139-8 kubenswrapper[2578]: E0416 09:01:09.536954 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d\": container with ID starting with e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d not found: ID does not exist" containerID="e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d" Apr 16 09:01:09.537039 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.536981 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d"} err="failed to get container status \"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d\": rpc error: code = NotFound desc = could not find container \"e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d\": container with ID starting with e6d8463ed3bdfe8e214d7ddc4a4e752834221555d9c8df68bf517af8d786317d not found: ID does not exist" Apr 16 09:01:09.537039 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.537000 2578 scope.go:117] "RemoveContainer" containerID="e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53" Apr 16 09:01:09.537219 ip-10-0-139-8 kubenswrapper[2578]: E0416 09:01:09.537201 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53\": container with ID starting with e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53 not found: ID does not exist" containerID="e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53" Apr 16 09:01:09.537256 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.537226 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53"} err="failed to get container status \"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53\": rpc error: code = NotFound desc = could not find container \"e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53\": container with ID starting with e38a571418f18d71add4682b1e2f173a124242321e4b934f1f8c43e239e06f53 not found: ID does not exist" Apr 16 09:01:09.540726 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.540693 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:01:09.544322 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:09.544297 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605500-t7nwn"] Apr 16 09:01:10.666853 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:01:10.666820 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6ecc42-0c29-414e-9a72-75b25fd08298" path="/var/lib/kubelet/pods/8e6ecc42-0c29-414e-9a72-75b25fd08298/volumes" Apr 16 09:04:08.699190 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:04:08.699161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 09:04:08.702265 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:04:08.702241 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 09:04:08.704372 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:04:08.704354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:04:08.707547 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:04:08.707527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:08:34.774891 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:34.774842 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5c66b85975-4phh9_758dc381-cecc-435a-9fb1-33ed17fa8c4b/authorino/0.log" Apr 16 09:08:38.773931 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:38.773897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f79877486-ptbtd_8bb71c58-a16f-4e7c-9bff-3111f3c94d48/manager/0.log" Apr 16 09:08:39.272373 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:39.272338 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-569944d57d-zqvw7_b488352f-eb0b-4eec-b9ba-5e9c536e67ea/manager/0.log" Apr 16 09:08:39.390727 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:39.390673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-s7mth_b2525202-92cf-4022-a6e5-7d4572ce65fb/postgres/0.log" Apr 16 09:08:40.642391 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:40.642352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5c66b85975-4phh9_758dc381-cecc-435a-9fb1-33ed17fa8c4b/authorino/0.log" Apr 16 09:08:40.762493 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:40.762463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-q8bvd_bc9c7905-4995-4372-bb7f-d566eddd696a/manager/0.log" Apr 16 09:08:41.099372 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:41.099340 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-s9hcc_074e55dc-7424-4e94-8cba-fc8d31c62fe0/registry-server/0.log" Apr 16 09:08:41.781583 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:41.781544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt_fc028e22-de25-4e9d-b201-a94f19dd4e66/istio-proxy/0.log" Apr 16 
09:08:42.253133 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:42.253089 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-vjxhq_5b162dcf-fda9-46fb-8501-7b81824cefca/istio-proxy/0.log" Apr 16 09:08:42.825933 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:42.825889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-hxdnd_9f329d91-e0c4-4987-8788-700da7886b55/main/0.log" Apr 16 09:08:42.833390 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:42.833347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-hxdnd_9f329d91-e0c4-4987-8788-700da7886b55/storage-initializer/0.log" Apr 16 09:08:42.940689 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:42.940659 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv_9bd69522-106a-4245-ba69-ec8d51af6dca/storage-initializer/0.log" Apr 16 09:08:42.949833 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:42.949802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcckrbtv_9bd69522-106a-4245-ba69-ec8d51af6dca/main/0.log" Apr 16 09:08:43.067525 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:43.067495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7_8d130a90-dcb1-40f8-b852-3d1d35375afd/storage-initializer/0.log" Apr 16 09:08:43.075128 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:43.075099 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-h79k7_8d130a90-dcb1-40f8-b852-3d1d35375afd/main/0.log" Apr 16 09:08:50.313408 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:50.313374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7mf7d_dc53c5fe-6772-48e1-b1d9-82b3bf47aca3/global-pull-secret-syncer/0.log" Apr 16 09:08:50.462924 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:50.462899 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g5rw9_654861bc-7246-41e6-a23f-20623cc156ef/konnectivity-agent/0.log" Apr 16 09:08:50.514048 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:50.514022 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-8.ec2.internal_ff050f160b53222212c83886cf4ca7d7/haproxy/0.log" Apr 16 09:08:54.511375 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:54.511268 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5c66b85975-4phh9_758dc381-cecc-435a-9fb1-33ed17fa8c4b/authorino/0.log" Apr 16 09:08:54.548422 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:54.548390 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-q8bvd_bc9c7905-4995-4372-bb7f-d566eddd696a/manager/0.log" Apr 16 09:08:54.682705 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:54.682665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-s9hcc_074e55dc-7424-4e94-8cba-fc8d31c62fe0/registry-server/0.log" Apr 16 09:08:56.343361 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.343330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/alertmanager/0.log" 
Apr 16 09:08:56.390211 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.390181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/config-reloader/0.log" Apr 16 09:08:56.418732 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.418694 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/kube-rbac-proxy-web/0.log" Apr 16 09:08:56.443542 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.443517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/kube-rbac-proxy/0.log" Apr 16 09:08:56.465036 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.465007 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/kube-rbac-proxy-metric/0.log" Apr 16 09:08:56.489663 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.489636 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/prom-label-proxy/0.log" Apr 16 09:08:56.513102 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.513072 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3c65bce5-5192-4f53-bc36-bbc33717edac/init-config-reloader/0.log" Apr 16 09:08:56.667943 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.667903 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-57bbcc56b8-r4vck_a8884475-b8ab-446b-bf80-e0b74c7da6f6/metrics-server/0.log" Apr 16 09:08:56.693115 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.693080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-mj4jq_441d3e87-1392-4ee0-97c2-285af9c5f52d/monitoring-plugin/0.log" Apr 16 09:08:56.728036 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.728006 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qsc79_baddd5f3-613c-43e0-92bd-81661f559e01/node-exporter/0.log" Apr 16 09:08:56.747890 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.747865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qsc79_baddd5f3-613c-43e0-92bd-81661f559e01/kube-rbac-proxy/0.log" Apr 16 09:08:56.768188 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:56.768164 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qsc79_baddd5f3-613c-43e0-92bd-81661f559e01/init-textfile/0.log" Apr 16 09:08:57.042382 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.042351 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/prometheus/0.log" Apr 16 09:08:57.066763 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.066732 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/config-reloader/0.log" Apr 16 09:08:57.089463 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.089428 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/thanos-sidecar/0.log" Apr 16 09:08:57.110993 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.110953 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/kube-rbac-proxy-web/0.log" Apr 16 09:08:57.134936 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.134864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/kube-rbac-proxy/0.log" Apr 16 09:08:57.156826 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.156802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/kube-rbac-proxy-thanos/0.log" Apr 16 09:08:57.179199 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:57.179175 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_193ac91b-a7b4-46b8-bd09-983570fff5c6/init-config-reloader/0.log" Apr 16 09:08:58.726393 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.726365 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv"] Apr 16 09:08:58.726767 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.726754 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerName="cleanup" Apr 16 09:08:58.726815 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.726768 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerName="cleanup" Apr 16 09:08:58.726853 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.726842 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e6ecc42-0c29-414e-9a72-75b25fd08298" containerName="cleanup" Apr 16 09:08:58.730043 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.730026 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.732518 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.732495 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmdlf\"/\"kube-root-ca.crt\"" Apr 16 09:08:58.732653 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.732569 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mmdlf\"/\"default-dockercfg-9mlmk\"" Apr 16 09:08:58.732743 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.732649 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmdlf\"/\"openshift-service-ca.crt\"" Apr 16 09:08:58.739497 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.739477 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv"] Apr 16 09:08:58.816842 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.816797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-podres\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.817018 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.816890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rq6\" (UniqueName: \"kubernetes.io/projected/5c058df2-98e7-4a42-acae-e2c60e00323c-kube-api-access-86rq6\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.817018 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.816942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-sys\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.817018 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.816964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-proc\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.817018 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.817005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-lib-modules\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918124 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86rq6\" (UniqueName: \"kubernetes.io/projected/5c058df2-98e7-4a42-acae-e2c60e00323c-kube-api-access-86rq6\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " 
pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918296 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-sys\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918296 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-proc\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918296 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-lib-modules\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918296 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-podres\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918450 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-sys\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918450 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-proc\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918450 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-lib-modules\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.918450 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.918392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c058df2-98e7-4a42-acae-e2c60e00323c-podres\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:58.925825 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:58.925803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rq6\" (UniqueName: 
\"kubernetes.io/projected/5c058df2-98e7-4a42-acae-e2c60e00323c-kube-api-access-86rq6\") pod \"perf-node-gather-daemonset-xcwlv\" (UID: \"5c058df2-98e7-4a42-acae-e2c60e00323c\") " pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:59.041746 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.041630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:08:59.104219 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.104018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 09:08:59.117139 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.117118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/2.log" Apr 16 09:08:59.169941 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.169916 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv"] Apr 16 09:08:59.172127 ip-10-0-139-8 kubenswrapper[2578]: W0416 09:08:59.172097 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c058df2_98e7_4a42_acae_e2c60e00323c.slice/crio-36c017babc0b2f8b6ae2001db225ca39fd1ec3e15fe01b477b663d24c3da4333 WatchSource:0}: Error finding container 36c017babc0b2f8b6ae2001db225ca39fd1ec3e15fe01b477b663d24c3da4333: Status 404 returned error can't find the container with id 36c017babc0b2f8b6ae2001db225ca39fd1ec3e15fe01b477b663d24c3da4333 Apr 16 09:08:59.173915 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.173900 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 09:08:59.215433 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.215406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" event={"ID":"5c058df2-98e7-4a42-acae-e2c60e00323c","Type":"ContainerStarted","Data":"36c017babc0b2f8b6ae2001db225ca39fd1ec3e15fe01b477b663d24c3da4333"} Apr 16 09:08:59.580313 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.580232 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85b4bfbb8c-kvbhs_337fbebc-d7d7-4879-a750-c451f801dd55/console/0.log" Apr 16 09:08:59.614057 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:08:59.614027 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-lgk8t_74643581-ba72-4b9d-ae5a-bc893d97b6a0/download-server/0.log" Apr 16 09:09:00.220082 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.220048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" event={"ID":"5c058df2-98e7-4a42-acae-e2c60e00323c","Type":"ContainerStarted","Data":"22139c75c4430cf090e7c9c9bc912ff72ce73fa6490ab7b94e1f66bb6ebc169e"} Apr 16 09:09:00.220499 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.220171 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:09:00.235263 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.235219 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" 
podStartSLOduration=2.235206301 podStartE2EDuration="2.235206301s" podCreationTimestamp="2026-04-16 09:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 09:09:00.234149259 +0000 UTC m=+1792.154135151" watchObservedRunningTime="2026-04-16 09:09:00.235206301 +0000 UTC m=+1792.155192170" Apr 16 09:09:00.803421 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.803397 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fqcdx_8cea109b-1867-4bf4-a48a-15604584a8d2/dns/0.log" Apr 16 09:09:00.824988 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.824963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fqcdx_8cea109b-1867-4bf4-a48a-15604584a8d2/kube-rbac-proxy/0.log" Apr 16 09:09:00.941792 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:00.941762 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6ngxf_1ba56a43-ff49-4b6d-a602-289479e4e2f7/dns-node-resolver/0.log" Apr 16 09:09:01.445346 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:01.445308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5bbbf98cd9-n68bb_1231a7af-9e87-4c88-9b24-457ae238ae51/registry/0.log" Apr 16 09:09:01.527748 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:01.527699 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xxvcl_9befbb38-427c-4f04-9ac5-007147cbf0ea/node-ca/0.log" Apr 16 09:09:02.290266 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:02.290233 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf6cpdt_fc028e22-de25-4e9d-b201-a94f19dd4e66/istio-proxy/0.log" Apr 16 09:09:02.581117 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:02.581027 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-vjxhq_5b162dcf-fda9-46fb-8501-7b81824cefca/istio-proxy/0.log" Apr 16 09:09:03.067871 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.067841 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4tvsn_11093fee-55ea-464a-b838-08d5d6f8e907/serve-healthcheck-canary/0.log" Apr 16 09:09:03.552677 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.552632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-nz5z7_97f24f3e-056d-4441-bbc0-42973fb6dcc4/insights-operator/0.log" Apr 16 09:09:03.552912 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.552843 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-nz5z7_97f24f3e-056d-4441-bbc0-42973fb6dcc4/insights-operator/1.log" Apr 16 09:09:03.643972 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.643942 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2rm2g_5a5de9ed-1057-4943-9b06-aade8bc24270/kube-rbac-proxy/0.log" Apr 16 09:09:03.664985 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.664950 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2rm2g_5a5de9ed-1057-4943-9b06-aade8bc24270/exporter/0.log" Apr 16 09:09:03.687509 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:03.687482 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-2rm2g_5a5de9ed-1057-4943-9b06-aade8bc24270/extractor/0.log" Apr 16 09:09:05.689547 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:05.689516 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f79877486-ptbtd_8bb71c58-a16f-4e7c-9bff-3111f3c94d48/manager/0.log" Apr 16 09:09:05.830726 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:05.830684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-569944d57d-zqvw7_b488352f-eb0b-4eec-b9ba-5e9c536e67ea/manager/0.log" Apr 16 09:09:05.850579 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:05.850552 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-s7mth_b2525202-92cf-4022-a6e5-7d4572ce65fb/postgres/0.log" Apr 16 09:09:06.234047 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:06.234022 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mmdlf/perf-node-gather-daemonset-xcwlv" Apr 16 09:09:06.967101 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:06.967072 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7cbc7f8cc-pgx6j_9e852241-561c-4ec2-b6bb-4f4811673c98/manager/0.log" Apr 16 09:09:08.726884 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:08.726737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 09:09:08.745854 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:08.730913 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-p5qrs_5c2240e1-8fc0-49b6-9c23-73dddaed0476/console-operator/1.log" Apr 16 09:09:08.745854 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:08.731826 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:09:08.745854 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:08.735731 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:09:11.632076 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:11.632026 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-m7lwq_c49165de-d15e-468a-9b37-71d0defef4a1/kube-storage-version-migrator-operator/1.log" Apr 16 09:09:11.633855 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:11.633825 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-m7lwq_c49165de-d15e-468a-9b37-71d0defef4a1/kube-storage-version-migrator-operator/0.log" Apr 16 09:09:12.880283 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.880244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/kube-multus-additional-cni-plugins/0.log" Apr 16 09:09:12.901438 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.901404 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/egress-router-binary-copy/0.log" Apr 16 09:09:12.926689 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.926658 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/cni-plugins/0.log" Apr 16 09:09:12.948766 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.948739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/bond-cni-plugin/0.log" Apr 16 09:09:12.970448 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.970419 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/routeoverride-cni/0.log" Apr 16 09:09:12.991072 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:12.991039 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/whereabouts-cni-bincopy/0.log" Apr 16 09:09:13.018538 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:13.018511 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nsv7m_2583f7f3-820e-46ff-b710-c2256f41f5c1/whereabouts-cni/0.log" Apr 16 09:09:13.281752 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:13.281650 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hh7sm_c6fc606d-9332-4d96-911e-24bed66bbda7/kube-multus/0.log" Apr 16 09:09:13.357898 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:13.357856 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kvw77_13e8353c-4eb0-4abd-98df-42ece4ec0318/network-metrics-daemon/0.log" Apr 16 09:09:13.376768 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:13.376738 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kvw77_13e8353c-4eb0-4abd-98df-42ece4ec0318/kube-rbac-proxy/0.log" Apr 16 09:09:14.542000 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.541969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-controller/0.log" Apr 16 09:09:14.559426 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.559388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/0.log" Apr 16 09:09:14.576492 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.576460 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovn-acl-logging/1.log" Apr 16 09:09:14.598109 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.598082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/kube-rbac-proxy-node/0.log" Apr 16 09:09:14.619758 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.619704 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 09:09:14.636818 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.636779 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/northd/0.log" Apr 16 09:09:14.658471 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.658445 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/nbdb/0.log" Apr 16 09:09:14.679674 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.679645 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/sbdb/0.log" Apr 16 09:09:14.849804 ip-10-0-139-8 kubenswrapper[2578]: I0416 09:09:14.849767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cvzdh_e2881a10-5691-4f22-92fd-70bdbdbacec2/ovnkube-controller/0.log"