Apr 17 17:22:29.337519 ip-10-0-135-105 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:22:29.337532 ip-10-0-135-105 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:22:29.337538 ip-10-0-135-105 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:22:29.337741 ip-10-0-135-105 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:22:39.349674 ip-10-0-135-105 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:22:39.349689 ip-10-0-135-105 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 15d59eda0d45488f89d3dfa2bc82a9d7 --
Apr 17 17:25:06.941323 ip-10-0-135-105 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:25:07.385279 ip-10-0-135-105 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:07.385279 ip-10-0-135-105 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:25:07.385279 ip-10-0-135-105 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:07.385279 ip-10-0-135-105 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:25:07.385279 ip-10-0-135-105 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:07.387104 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.387002    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:25:07.391784 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391765    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.391784 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391785    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391790    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391795    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391798    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391801    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391804    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391807    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391809    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391812    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391815    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391818    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391820    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391823    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391826    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391828    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391831    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391833    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391836    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391839    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.391853 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391842    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391844    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391847    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391849    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391852    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391854    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391857    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391860    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391863    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391866    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391868    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391871    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391873    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391875    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391878    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391881    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391883    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391886    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391888    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391891    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.392341 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391893    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391896    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391899    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391901    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391903    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391906    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391908    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391911    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391913    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391915    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391918    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391920    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391923    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391925    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391928    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391934    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391937    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391953    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391956    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.392819 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391959    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391962    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391965    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391968    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391971    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391974    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391977    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391981    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.391984    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392000    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392003    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392005    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392008    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392010    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392013    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392016    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392018    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392021    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392023    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392026    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.393298 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392028    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392032    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392035    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392038    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392040    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392043    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392045    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392471    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392477    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392479    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392482    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392484    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392487    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392490    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392493    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392496    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392498    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392501    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392503    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392506    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392509    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.393877 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392512    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392514    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392516    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392519    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392522    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392524    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392527    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392529    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392531    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392534    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392536    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392538    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392541    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392543    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392546    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392548    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392550    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392553    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392556    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392559    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.394435 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392562    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392564    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392567    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392569    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392572    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392574    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392577    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392579    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392582    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392586    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392589    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392593    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392595    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392598    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392601    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392604    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392606    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392609    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392611    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.394950 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392613    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392616    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392619    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392622    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392624    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392627    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392631    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392633    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392636    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392638    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392641    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392644    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392646    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392649    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392652    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392654    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392657    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392660    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392662    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392664    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.395433 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392667    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392669    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392671    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392674    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392676    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392679    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392682    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392684    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392687    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392689    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392692    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392695    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.392698    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394093    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394104    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394133    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394138    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394143    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394147    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394151    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394156    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:25:07.395928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394159    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394162    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394166    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394169    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394173    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394176    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394179    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394182    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394185    2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394187    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394190    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394195    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394198    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394201    2571 flags.go:64] FLAG: --config-dir=""
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394204    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394208 2571 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394212 2571 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394218 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394221 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394225 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394228 2571 flags.go:64] FLAG: --contention-profiling="false" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394231 2571 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394235 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394238 2571 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394241 2571 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 17:25:07.396511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394245 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394248 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394251 2571 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394255 2571 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 17:25:07.397138 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394258 2571 flags.go:64] FLAG: --enable-server="true" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394261 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394265 2571 flags.go:64] FLAG: --event-burst="100" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394269 2571 flags.go:64] FLAG: --event-qps="50" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394272 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394275 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394278 2571 flags.go:64] FLAG: --eviction-hard="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394282 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394285 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394288 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394291 2571 flags.go:64] FLAG: --eviction-soft="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394294 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394297 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394300 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394303 2571 
flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394306 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394309 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394312 2571 flags.go:64] FLAG: --feature-gates="" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394316 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394319 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394322 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:25:07.397138 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394326 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394329 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394333 2571 flags.go:64] FLAG: --help="false" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394336 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-135-105.ec2.internal" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394339 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394343 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394346 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394350 2571 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394353 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394356 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394360 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394363 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394365 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394369 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394372 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394375 2571 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394378 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394381 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394384 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394387 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394389 2571 flags.go:64] FLAG: --lock-file="" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394392 2571 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394395 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394398 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:25:07.397759 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394404 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394406 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394409 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394412 2571 flags.go:64] FLAG: --logging-format="text" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394415 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394418 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394421 2571 flags.go:64] FLAG: --manifest-url="" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394424 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394429 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394432 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394436 2571 flags.go:64] FLAG: --max-pods="110" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394440 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: 
I0417 17:25:07.394443 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394446 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394449 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394453 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394456 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394459 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394467 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394470 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394473 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394477 2571 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394479 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:25:07.398367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394486 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394489 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394492 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 17 
17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394495 2571 flags.go:64] FLAG: --port="10250" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394497 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394500 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a98e3848918f19d5" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394503 2571 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394506 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394509 2571 flags.go:64] FLAG: --register-node="true" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394511 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394514 2571 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394518 2571 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394520 2571 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394523 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394527 2571 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394530 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394533 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394536 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 
17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394539 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394543 2571 flags.go:64] FLAG: --runonce="false" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394546 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394549 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394552 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394555 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394558 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394561 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:25:07.398931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394564 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394567 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394570 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394573 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394576 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394579 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:25:07.399589 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:25:07.394582 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394586 2571 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394588 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394594 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394597 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394600 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394605 2571 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394608 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394611 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394614 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394617 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394620 2571 flags.go:64] FLAG: --v="2" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394624 2571 flags.go:64] FLAG: --version="false" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394628 2571 flags.go:64] FLAG: --vmodule="" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394633 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 
17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.394636 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394738 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394742 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394745 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:25:07.399589 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394748 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394751 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394754 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394757 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394759 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394763 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394767 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394769 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394775 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394777 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394780 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394783 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394785 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394788 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394791 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394794 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394797 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394799 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394802 2571 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:25:07.400218 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394805 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394808 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394812 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394815 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394818 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394821 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394824 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394827 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394829 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394831 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394834 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394836 2571 feature_gate.go:328] unrecognized 
feature gate: InsightsConfigAPI Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394840 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394842 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394845 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394848 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394851 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394853 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394855 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394858 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:25:07.400706 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394860 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394865 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394867 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394870 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394872 2571 feature_gate.go:328] unrecognized feature 
gate: ExternalOIDC Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394874 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394876 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394879 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394881 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394883 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394886 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394888 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394890 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394893 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394895 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394897 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394900 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 
17:25:07.394902 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394904 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394907 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:25:07.401276 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394909 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394912 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394914 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394916 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394919 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394924 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394927 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394929 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394932 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394935 2571 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394937 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394940 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394942 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394946 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394949 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394951 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394954 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394956 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394959 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.401796 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394962 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.402300 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394964 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.402300 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394966 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.402300 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394969 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.402300 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.394971 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.402300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.395943 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.402948 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.402927 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:25:07.402982 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.402949 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:25:07.403059 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403050 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.403059 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403059 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403063 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403066 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403069 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403072 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403075 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403077 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403080 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403083 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403085 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403088 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403090 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403099 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403101 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403104 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403106 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403108 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403113 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403117 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.403118 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403120 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403124 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403128 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403132 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403135 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403138 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403140 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403143 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403145 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403148 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403151 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403153 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403155 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403158 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403160 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403163 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403165 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403168 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403171 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403174 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.403596 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403176 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403178 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403181 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403183 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403185 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403188 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403196 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403199 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403201 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403204 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403206 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403208 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403211 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403213 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403218 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403220 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403223 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403225 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403228 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403230 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.404149 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403233 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403235 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403238 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403240 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403242 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403245 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403247 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403249 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403252 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403254 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403257 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403259 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403262 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403264 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403267 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403269 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403271 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403274 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403276 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403286 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.404637 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403289 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403291 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403294 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403296 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403299 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403302 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.403308 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403449 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403454 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403457 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403460 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403463 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403466 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403468 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403471 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.405163 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403474 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403478 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403482 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403484 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403487 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403489 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403492 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403494 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403496 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403499 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403502 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403504 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403507 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403509 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403512 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403514 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403518 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403526 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403529 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403531 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.405544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403533 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403536 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403538 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403541 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403543 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403546 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403548 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403550 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403553 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403555 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403557 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403560 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403562 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403565 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403567 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403570 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403572 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403574 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403577 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403579 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.406055 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403582 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403584 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403587 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403589 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403592 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403594 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403597 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403600 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403602 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403605 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403613 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403615 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403618 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403620 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403623 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403625 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403628 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403630 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403633 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.406551 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403635 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403639 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403642 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403645 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403648 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403650 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403653 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403656 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403658 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403661 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403663 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403666 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403668 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403670 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403673 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403675 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403678 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403681 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.407052 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:07.403683 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.407527 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.403688 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.407527 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.404396 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:25:07.407527 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.406489 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:25:07.407527 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.407466 2571 server.go:1019] "Starting client certificate rotation"
Apr 17 17:25:07.407631 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.407599 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:07.408560 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.408545 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:07.433417 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.433390 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:07.439483 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.439455 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:07.455210 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.455183 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:25:07.463224 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.463201 2571 log.go:25] "Validated CRI v1 image API"
Apr 17 17:25:07.464458 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.464437 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:25:07.467622 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.467599 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:07.469915 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.469894 2571 fs.go:135] Filesystem UUIDs: map[545828d8-48ef-441f-97a3-0a9c2599050b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f6097634-64df-4106-afb7-b6dede3835f1:/dev/nvme0n1p4]
Apr 17 17:25:07.469958 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.469917 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:25:07.476009 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.475866 2571 manager.go:217] Machine: {Timestamp:2026-04-17 17:25:07.473803568 +0000 UTC m=+0.412708834 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3088210 MemoryCapacity:32812158976 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec273f0305b579313ba60c810327ab7c SystemUUID:ec273f03-05b5-7931-3ba6-0c810327ab7c BootID:15d59eda-0d45-488f-89d3-dfa2bc82a9d7 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406077440 Type:vfs Inodes:4005390 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e6:22:68:22:eb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e6:22:68:22:eb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:96:c7:c2:24:a5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812158976 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:25:07.476009 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.476006 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:25:07.476123 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.476104 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:25:07.477293 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.477263 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:25:07.477464 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.477297 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-105.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:25:07.477509 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.477474 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:25:07.477509 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.477482 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:25:07.477509 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.477496
2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:07.478383 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.478369 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:07.479197 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.479185 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:07.479336 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.479327 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:25:07.481787 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.481774 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:25:07.481833 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.481796 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:25:07.481833 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.481809 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:25:07.481833 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.481818 2571 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:25:07.481833 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.481827 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:25:07.483064 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.483048 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:07.483134 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.483069 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:07.486530 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.486506 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:25:07.488489 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:25:07.488470 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:25:07.489668 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.489639 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-725hm" Apr 17 17:25:07.490772 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490758 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490779 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490788 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490796 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490805 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490821 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490829 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:25:07.490851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490838 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:25:07.491111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490870 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:25:07.491111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490881 2571 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:25:07.491111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490902 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:25:07.491111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.490915 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:25:07.492963 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.492931 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:25:07.492963 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.492951 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:25:07.495960 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.495869 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-725hm" Apr 17 17:25:07.498903 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.498887 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:25:07.499012 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.498942 2571 server.go:1295] "Started kubelet" Apr 17 17:25:07.499085 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.499049 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:25:07.499143 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.499062 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:25:07.499143 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.499138 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:25:07.499831 ip-10-0-135-105 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:25:07.501207 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.501188 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 17:25:07.501954 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.501925 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 17:25:07.502061 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.502045 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:07.506084 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.506064 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-105.ec2.internal" not found
Apr 17 17:25:07.506224 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.506204 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 17:25:07.506296 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.506263 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:07.506800 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.506779 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 17:25:07.507760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.507725 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 17:25:07.507760 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.507740 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-105.ec2.internal\" not found"
Apr 17 17:25:07.507898 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.507845 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 17:25:07.507898 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.507873 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508020 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508043 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508300 2571 factory.go:55] Registering systemd factory
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508320 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508640 2571 factory.go:153] Registering CRI-O factory
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508651 2571 factory.go:223] Registration of the crio container factory successfully
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508707 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508737 2571 factory.go:103] Registering Raw factory
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.508751 2571 manager.go:1196] Started watching for new ooms in manager
Apr 17 17:25:07.509312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.509162 2571 manager.go:319] Starting recovery of all containers
Apr 17 17:25:07.510078 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.510058 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:07.511719 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.511359 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 17:25:07.515719 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.515532 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-105.ec2.internal\" not found" node="ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.520929 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.520912 2571 manager.go:324] Recovery completed
Apr 17 17:25:07.522357 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.522327 2571 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 17:25:07.523701 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.523684 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-105.ec2.internal" not found
Apr 17 17:25:07.525321 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.525309 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:25:07.527625 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.527608 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:25:07.527625 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.527638 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:25:07.527766 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.527649 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:25:07.528244 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.528230 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 17:25:07.528244 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.528242 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 17:25:07.528333 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.528260 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:25:07.531359 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.531347 2571 policy_none.go:49] "None policy: Start"
Apr 17 17:25:07.531399 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.531363 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 17:25:07.531399 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.531374 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 17:25:07.576708 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.576685 2571 manager.go:341] "Starting Device Plugin manager"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.576728 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.576741 2571 server.go:85] "Starting device plugin registration server"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.577032 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.577042 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.577143 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.577242 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.577251 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.577798 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.577835 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-105.ec2.internal\" not found"
Apr 17 17:25:07.598867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.580234 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-105.ec2.internal" not found
Apr 17 17:25:07.651341 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.651254 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 17:25:07.652710 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.652691 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 17:25:07.652780 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.652722 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 17:25:07.652780 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.652746 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 17:25:07.652780 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.652753 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 17:25:07.652922 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:07.652793 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 17:25:07.655439 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.655415 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:07.678073 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.678042 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:25:07.679270 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.679252 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:25:07.679372 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.679283 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:25:07.679372 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.679295 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:25:07.679372 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.679319 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.688499 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.688478 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.753479 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.753438 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"]
Apr 17 17:25:07.758098 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.758078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.758174 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.758089 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.784799 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.784773 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.789068 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.789051 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.806310 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.806282 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:25:07.806444 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.806282 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:25:07.810844 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.810823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d0b942ad205f5fb58ca3113d3bd1bee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-105.ec2.internal\" (UID: \"2d0b942ad205f5fb58ca3113d3bd1bee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.810904 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.810856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.810904 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.810882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912047 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.911908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912047 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.911974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912047 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.912019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d0b942ad205f5fb58ca3113d3bd1bee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-105.ec2.internal\" (UID: \"2d0b942ad205f5fb58ca3113d3bd1bee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912047 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.912024 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912047 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.912050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d0b942ad205f5fb58ca3113d3bd1bee-config\") pod \"kube-apiserver-proxy-ip-10-0-135-105.ec2.internal\" (UID: \"2d0b942ad205f5fb58ca3113d3bd1bee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:07.912317 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:07.912032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c56e93d9fa5e41391e4558a0134622a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal\" (UID: \"c56e93d9fa5e41391e4558a0134622a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:08.109145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.109117 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:08.110280 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.110262 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"
Apr 17 17:25:08.407132 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.407056 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:25:08.407682 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.407243 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:25:08.407682 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.407282 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:25:08.407682 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.407279 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:25:08.482400 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.482361 2571 apiserver.go:52] "Watching apiserver"
Apr 17 17:25:08.491490 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.491462 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:25:08.492779 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.492756 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hxcr5","openshift-multus/network-metrics-daemon-zbptt","openshift-network-diagnostics/network-check-target-lwq8d","openshift-network-operator/iptables-alerter-2l9v7","openshift-ovn-kubernetes/ovnkube-node-hhm5l","kube-system/global-pull-secret-syncer-lzfd4","kube-system/konnectivity-agent-w7cnl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7","openshift-cluster-node-tuning-operator/tuned-j67sl","openshift-dns/node-resolver-hzc4l","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal","openshift-multus/multus-additional-cni-plugins-5fv24","openshift-multus/multus-kx7gk","kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal"]
Apr 17 17:25:08.496465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.496435 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:08.498192 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.498010 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:20:07 +0000 UTC" deadline="2027-12-31 09:47:40.957499538 +0000 UTC"
Apr 17 17:25:08.498192 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.498068 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14944h22m32.459435325s"
Apr 17 17:25:08.499281 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.499259 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 17:25:08.499393 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.499278 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nxwz8\""
Apr 17 17:25:08.499393 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.499265 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 17:25:08.502341 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.502321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.504186 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:08.504285 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504252 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.504534 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504363 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 17:25:08.504534 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.504244 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:08.504534 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504461 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 17:25:08.504732 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cjj9q\""
Apr 17 17:25:08.504792 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504776 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 17:25:08.504825 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.504808 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 17:25:08.506509 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.506492 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:25:08.506575 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.506552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.507181 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.507156 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:08.507266 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.507187 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:08.507325 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.507273 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8fntx\""
Apr 17 17:25:08.507325 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.507316 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 17:25:08.508440 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.508428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:25:08.508972 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.508958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.510217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.510199 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:25:08.510307 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.510200 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:25:08.511240 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511177 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:25:08.511240 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511180 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:25:08.511355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511265 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:25:08.511584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511563 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-65brh\""
Apr 17 17:25:08.511747 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511726 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vnn4x\""
Apr 17 17:25:08.512252 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.511983 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/node-ca-hxcr5" Apr 17 17:25:08.512364 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.512351 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:25:08.512460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.512442 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:08.514322 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.514298 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:08.514428 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.514367 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:08.514833 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.514810 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:25:08.514930 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.514912 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lcdx\"" Apr 17 17:25:08.514979 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.514934 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:25:08.515052 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.514816 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:25:08.515610 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-config\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.515688 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.515688 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-systemd-units\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.515775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-bin\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.515775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-env-overrides\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.515775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-k8s-cni-cncf-io\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.515911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354f25fd-1072-4dea-8ce2-1953b417053a-ovn-node-metrics-cert\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.515911 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:25:08.515854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cnibin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516167 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.515973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-os-release\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516167 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-hostroot\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516167 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-tmp\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.516303 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-netns\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516353 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516210 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-systemd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516353 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-host\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.516455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:08.516455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sln5t\" (UniqueName: \"kubernetes.io/projected/354f25fd-1072-4dea-8ce2-1953b417053a-kube-api-access-sln5t\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-multus\") pod \"multus-kx7gk\" (UID: 
\"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-slash\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516491 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-log-socket\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f443eb6-ad37-49de-a7de-e96834d08311-agent-certs\") pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-multus-certs\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-ovn\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-etc-kubernetes\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-var-lib-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-socket-dir-parent\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516744 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-daemon-config\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-run\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-etc-tuned\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516835 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cni-binary-copy\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516861 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-conf-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.516911 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:25:08.516902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h56b\" (UniqueName: \"kubernetes.io/projected/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-kube-api-access-4h56b\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw2cj\" (UniqueName: \"kubernetes.io/projected/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-kube-api-access-gw2cj\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.516970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f443eb6-ad37-49de-a7de-e96834d08311-konnectivity-ca\") pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-system-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517043 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-bin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " 
pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-kubelet\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-etc-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-node-log\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-kubernetes\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-systemd\") pod 
\"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-var-lib-kubelet\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-iptables-alerter-script\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-modprobe-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-conf\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517325 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-lib-modules\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjzrq\" (UniqueName: \"kubernetes.io/projected/38a8697c-ea92-4488-90c5-13d587599dba-kube-api-access-zjzrq\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-kubelet\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-netns\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517453 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzc4l" Apr 17 17:25:08.517485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517546 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-script-lib\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysconfig\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-sys\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-host-slash\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7" Apr 17 17:25:08.517893 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.517740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-netd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.519715 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.519697 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nkq72\"" Apr 17 17:25:08.519808 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.519725 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.519808 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.519786 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 17:25:08.520038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.520016 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 17:25:08.522030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.522014 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.522816 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.522709 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:25:08.522882 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.522832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mvbh9\""
Apr 17 17:25:08.522882 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.522856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:25:08.524221 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.524202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:08.524324 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.524254 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:08.527118 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.524874 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 17:25:08.527118 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.525090 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 17:25:08.527118 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.525096 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 17:25:08.527118 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.525223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s5krx\""
Apr 17 17:25:08.527118 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.525634 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:08.542633 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.542605 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf2dg"
Apr 17 17:25:08.549183 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.549163 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf2dg"
Apr 17 17:25:08.609261 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.609239 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:25:08.618165 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swspk\" (UniqueName: \"kubernetes.io/projected/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-kube-api-access-swspk\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-device-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-var-lib-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-socket-dir-parent\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-daemon-config\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-var-lib-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-socket-dir-parent\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-socket-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618338 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-run\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.618360 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-etc-tuned\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cni-binary-copy\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-conf-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-run\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h56b\" (UniqueName: \"kubernetes.io/projected/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-kube-api-access-4h56b\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618629 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-conf-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618751 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gw2cj\" (UniqueName: \"kubernetes.io/projected/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-kube-api-access-gw2cj\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618805 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f443eb6-ad37-49de-a7de-e96834d08311-konnectivity-ca\") pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:08.618855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-system-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-bin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cni-binary-copy\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-kubelet\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-etc-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-node-log\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619027 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-bin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619046 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-system-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.618806 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-daemon-config\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-kubernetes\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-node-log\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-systemd\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-kubelet\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-kubernetes\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-systemd\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-var-lib-kubelet\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619239 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-var-lib-kubelet\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.619265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-etc-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-iptables-alerter-script\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-modprobe-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-conf\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-modprobe-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-lib-modules\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-conf\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619487 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f443eb6-ad37-49de-a7de-e96834d08311-konnectivity-ca\") pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjzrq\" (UniqueName: \"kubernetes.io/projected/38a8697c-ea92-4488-90c5-13d587599dba-kube-api-access-zjzrq\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-os-release\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-lib-modules\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmx9\" (UniqueName: \"kubernetes.io/projected/dcf417a0-db1f-4d94-9f13-bd789e955760-kube-api-access-snmx9\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-sys-fs\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzml\" (UniqueName: \"kubernetes.io/projected/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kube-api-access-gtzml\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-kubelet\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-netns\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.620177 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-kubelet\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-iptables-alerter-script\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-cnibin\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplx6\" (UniqueName: \"kubernetes.io/projected/fd529c67-6434-4e26-bfa8-0edc94a3b098-kube-api-access-nplx6\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-netns\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-script-lib\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysconfig\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619871 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-openvswitch\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-sys\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysconfig\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-sys\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.619980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-host-slash\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-netd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:08.621023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-config\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdr89\" (UniqueName: \"kubernetes.io/projected/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-kube-api-access-qdr89\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-systemd-units\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-bin\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-env-overrides\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-k8s-cni-cncf-io\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354f25fd-1072-4dea-8ce2-1953b417053a-ovn-node-metrics-cert\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cnibin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-script-lib\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-os-release\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-hostroot\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-host\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-registration-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.621760 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620413 2571 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-tmp\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-netns\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-systemd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-etc-sysctl-d\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-systemd-units\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-env-overrides\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620583 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-bin\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-os-release\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-host-slash\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620629 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-k8s-cni-cncf-io\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-multus-cni-dir\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354f25fd-1072-4dea-8ce2-1953b417053a-ovnkube-config\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-hostroot\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-cni-netd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620815 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-netns\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-host\") pod \"tuned-j67sl\" (UID: 
\"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:08.622455 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-systemd\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.620983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sln5t\" (UniqueName: \"kubernetes.io/projected/354f25fd-1072-4dea-8ce2-1953b417053a-kube-api-access-sln5t\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-multus\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-system-cni-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621097 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-dbus\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-var-lib-cni-multus\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621131 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a8697c-ea92-4488-90c5-13d587599dba-host\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd529c67-6434-4e26-bfa8-0edc94a3b098-hosts-file\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621172 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd529c67-6434-4e26-bfa8-0edc94a3b098-tmp-dir\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-cnibin\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-slash\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-slash\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-log-socket\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-log-socket\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f443eb6-ad37-49de-a7de-e96834d08311-agent-certs\") pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl" Apr 17 17:25:08.623294 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-multus-certs\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-serviceca\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-kubelet-config\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-host-run-multus-certs\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-ovn\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-etc-kubernetes\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/354f25fd-1072-4dea-8ce2-1953b417053a-run-ovn\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.621797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-etc-kubernetes\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.622500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-etc-tuned\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.623479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354f25fd-1072-4dea-8ce2-1953b417053a-ovn-node-metrics-cert\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.623673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38a8697c-ea92-4488-90c5-13d587599dba-tmp\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.624148 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.624150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f443eb6-ad37-49de-a7de-e96834d08311-agent-certs\") 
pod \"konnectivity-agent-w7cnl\" (UID: \"1f443eb6-ad37-49de-a7de-e96834d08311\") " pod="kube-system/konnectivity-agent-w7cnl" Apr 17 17:25:08.629551 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.629139 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:08.629551 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.629164 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:08.629551 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.629178 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:08.629551 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.629250 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.129227216 +0000 UTC m=+2.068132458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:08.631600 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.631560 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw2cj\" (UniqueName: \"kubernetes.io/projected/54be9ce9-8e34-41ad-a9f8-f058d0ba0624-kube-api-access-gw2cj\") pod \"multus-kx7gk\" (UID: \"54be9ce9-8e34-41ad-a9f8-f058d0ba0624\") " pod="openshift-multus/multus-kx7gk" Apr 17 17:25:08.631838 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.631789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sln5t\" (UniqueName: \"kubernetes.io/projected/354f25fd-1072-4dea-8ce2-1953b417053a-kube-api-access-sln5t\") pod \"ovnkube-node-hhm5l\" (UID: \"354f25fd-1072-4dea-8ce2-1953b417053a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" Apr 17 17:25:08.632154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.632052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h56b\" (UniqueName: \"kubernetes.io/projected/ba50df31-c8c4-481e-8f5a-53f3fc99f52c-kube-api-access-4h56b\") pod \"iptables-alerter-2l9v7\" (UID: \"ba50df31-c8c4-481e-8f5a-53f3fc99f52c\") " pod="openshift-network-operator/iptables-alerter-2l9v7" Apr 17 17:25:08.634588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.634567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjzrq\" (UniqueName: \"kubernetes.io/projected/38a8697c-ea92-4488-90c5-13d587599dba-kube-api-access-zjzrq\") pod \"tuned-j67sl\" (UID: \"38a8697c-ea92-4488-90c5-13d587599dba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-j67sl" Apr 17 17:25:08.656544 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.656514 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0b942ad205f5fb58ca3113d3bd1bee.slice/crio-c355953fb571ad01d9431ef59daec2d7ae422b91c5629ca753ecb92a4e239189 WatchSource:0}: Error finding container c355953fb571ad01d9431ef59daec2d7ae422b91c5629ca753ecb92a4e239189: Status 404 returned error can't find the container with id c355953fb571ad01d9431ef59daec2d7ae422b91c5629ca753ecb92a4e239189 Apr 17 17:25:08.657329 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.657248 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56e93d9fa5e41391e4558a0134622a0.slice/crio-a1f3bed5f783bb6e97065d183f3c2d7c8896bbc5e97b32324c146d363f58fcda WatchSource:0}: Error finding container a1f3bed5f783bb6e97065d183f3c2d7c8896bbc5e97b32324c146d363f58fcda: Status 404 returned error can't find the container with id a1f3bed5f783bb6e97065d183f3c2d7c8896bbc5e97b32324c146d363f58fcda Apr 17 17:25:08.660799 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.660784 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:25:08.722517 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722482 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdr89\" (UniqueName: \"kubernetes.io/projected/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-kube-api-access-qdr89\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:08.722517 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-host\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722691 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-registration-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" Apr 17 17:25:08.722733 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-system-cni-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-host\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-dbus\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:08.723030 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-system-cni-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-registration-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd529c67-6434-4e26-bfa8-0edc94a3b098-hosts-file\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.722870 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722878 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd529c67-6434-4e26-bfa8-0edc94a3b098-tmp-dir\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l" Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-dbus\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd529c67-6434-4e26-bfa8-0edc94a3b098-hosts-file\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-serviceca\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.722935 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.222913965 +0000 UTC m=+2.161819221 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.722973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-kubelet-config\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:08.723030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swspk\" (UniqueName: \"kubernetes.io/projected/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-kube-api-access-swspk\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-device-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78665d2b-9809-4082-abf9-62de202c8f2f-kubelet-config\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:08.723606 ip-10-0-135-105
kubenswrapper[2571]: I0417 17:25:08.723087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-socket-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd529c67-6434-4e26-bfa8-0edc94a3b098-tmp-dir\") pod \"node-resolver-hzc4l\" (UID:
\"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-os-release\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snmx9\" (UniqueName: \"kubernetes.io/projected/dcf417a0-db1f-4d94-9f13-bd789e955760-kube-api-access-snmx9\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-sys-fs\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723258 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-device-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName:
\"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtzml\" (UniqueName: \"kubernetes.io/projected/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kube-api-access-gtzml\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.723606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723309 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.724107
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-cnibin\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-os-release\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nplx6\" (UniqueName: \"kubernetes.io/projected/fd529c67-6434-4e26-bfa8-0edc94a3b098-kube-api-access-nplx6\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-serviceca\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-socket-dir\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/18c0221a-27a0-4a22-a6bf-befe72dd72c0-sys-fs\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcf417a0-db1f-4d94-9f13-bd789e955760-cnibin\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.723496 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:08.723544 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.223526283 +0000 UTC m=+2.162431541 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:08.724107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.723675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcf417a0-db1f-4d94-9f13-bd789e955760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.734499 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.734474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmx9\" (UniqueName: \"kubernetes.io/projected/dcf417a0-db1f-4d94-9f13-bd789e955760-kube-api-access-snmx9\") pod \"multus-additional-cni-plugins-5fv24\" (UID: \"dcf417a0-db1f-4d94-9f13-bd789e955760\") " pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.734641 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.734604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swspk\" (UniqueName: \"kubernetes.io/projected/e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b-kube-api-access-swspk\") pod \"node-ca-hxcr5\" (UID: \"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b\") " pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.734883 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.734866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplx6\" (UniqueName: \"kubernetes.io/projected/fd529c67-6434-4e26-bfa8-0edc94a3b098-kube-api-access-nplx6\") pod \"node-resolver-hzc4l\" (UID: \"fd529c67-6434-4e26-bfa8-0edc94a3b098\") " pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.734963
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.734952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtzml\" (UniqueName: \"kubernetes.io/projected/18c0221a-27a0-4a22-a6bf-befe72dd72c0-kube-api-access-gtzml\") pod \"aws-ebs-csi-driver-node-mvzv7\" (UID: \"18c0221a-27a0-4a22-a6bf-befe72dd72c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.735843 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.735824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdr89\" (UniqueName: \"kubernetes.io/projected/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-kube-api-access-qdr89\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:08.820891 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.820857 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:08.827270 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.827244 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f443eb6_ad37_49de_a7de_e96834d08311.slice/crio-e8708e69589adb0a3e92bc554383cb683c0dfc96bfcacd64ffbdec799e84ad36 WatchSource:0}: Error finding container e8708e69589adb0a3e92bc554383cb683c0dfc96bfcacd64ffbdec799e84ad36: Status 404 returned error can't find the container with id e8708e69589adb0a3e92bc554383cb683c0dfc96bfcacd64ffbdec799e84ad36
Apr 17 17:25:08.837909 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.837888 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-kx7gk"
Apr 17 17:25:08.844768 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.844740 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54be9ce9_8e34_41ad_a9f8_f058d0ba0624.slice/crio-f7b2e94a7e947bfd27a86ca6d25a40e6b1e930736de47c68615def74af14eb0e WatchSource:0}: Error finding container f7b2e94a7e947bfd27a86ca6d25a40e6b1e930736de47c68615def74af14eb0e: Status 404 returned error can't find the container with id f7b2e94a7e947bfd27a86ca6d25a40e6b1e930736de47c68615def74af14eb0e
Apr 17 17:25:08.858965 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.858940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2l9v7"
Apr 17 17:25:08.862640 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.862617 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:08.866351 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.866325 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba50df31_c8c4_481e_8f5a_53f3fc99f52c.slice/crio-7d715d6af513325e1d9447756bd5eaf36564448d264b41febed7c8bfb6fcfc78 WatchSource:0}: Error finding container 7d715d6af513325e1d9447756bd5eaf36564448d264b41febed7c8bfb6fcfc78: Status 404 returned error can't find the container with id 7d715d6af513325e1d9447756bd5eaf36564448d264b41febed7c8bfb6fcfc78
Apr 17 17:25:08.870789 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.870762 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f25fd_1072_4dea_8ce2_1953b417053a.slice/crio-ef64983b07f7b5b0c727745b621c0c74c08569f8cb22744146a62e5964ab2c42 WatchSource:0}: Error finding container
ef64983b07f7b5b0c727745b621c0c74c08569f8cb22744146a62e5964ab2c42: Status 404 returned error can't find the container with id ef64983b07f7b5b0c727745b621c0c74c08569f8cb22744146a62e5964ab2c42
Apr 17 17:25:08.887618 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.887593 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j67sl"
Apr 17 17:25:08.893507 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.893479 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a8697c_ea92_4488_90c5_13d587599dba.slice/crio-cdf0a0d20c5128ca8137f2b8284251c9ff0c1ca372f14c0b3c2ddb7cc73165ef WatchSource:0}: Error finding container cdf0a0d20c5128ca8137f2b8284251c9ff0c1ca372f14c0b3c2ddb7cc73165ef: Status 404 returned error can't find the container with id cdf0a0d20c5128ca8137f2b8284251c9ff0c1ca372f14c0b3c2ddb7cc73165ef
Apr 17 17:25:08.904638 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.904617 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hxcr5"
Apr 17 17:25:08.911501 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.911474 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48539b9_0ec2_4c8c_bdd6_a57dbcc0fd2b.slice/crio-0f8623f2d9a5ca4034fa377e21a943f96e9e00671baf2c4a71eb538ae7c74f55 WatchSource:0}: Error finding container 0f8623f2d9a5ca4034fa377e21a943f96e9e00671baf2c4a71eb538ae7c74f55: Status 404 returned error can't find the container with id 0f8623f2d9a5ca4034fa377e21a943f96e9e00671baf2c4a71eb538ae7c74f55
Apr 17 17:25:08.920777 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.920759 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/node-resolver-hzc4l"
Apr 17 17:25:08.926855 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.926827 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd529c67_6434_4e26_bfa8_0edc94a3b098.slice/crio-bef8674addedfdbaf16e49db8c27ac336e5c08e4855e694a9fd27a9b47d2d83d WatchSource:0}: Error finding container bef8674addedfdbaf16e49db8c27ac336e5c08e4855e694a9fd27a9b47d2d83d: Status 404 returned error can't find the container with id bef8674addedfdbaf16e49db8c27ac336e5c08e4855e694a9fd27a9b47d2d83d
Apr 17 17:25:08.938443 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.938416 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fv24"
Apr 17 17:25:08.942103 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:08.942081 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7"
Apr 17 17:25:08.944960 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.944939 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf417a0_db1f_4d94_9f13_bd789e955760.slice/crio-9c65eb0aa54e5698e8ec49f868a63de843dbd6a3a9c00b52f258490dc29ad180 WatchSource:0}: Error finding container 9c65eb0aa54e5698e8ec49f868a63de843dbd6a3a9c00b52f258490dc29ad180: Status 404 returned error can't find the container with id 9c65eb0aa54e5698e8ec49f868a63de843dbd6a3a9c00b52f258490dc29ad180
Apr 17 17:25:08.950039 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:25:08.949595 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c0221a_27a0_4a22_a6bf_befe72dd72c0.slice/crio-2d092d51f1e47b4d750423bd4a40492ead1f39a4e95722086dd8858082d7b6e1 WatchSource:0}: Error finding container
2d092d51f1e47b4d750423bd4a40492ead1f39a4e95722086dd8858082d7b6e1: Status 404 returned error can't find the container with id 2d092d51f1e47b4d750423bd4a40492ead1f39a4e95722086dd8858082d7b6e1
Apr 17 17:25:09.228506 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.228425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:09.228506 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.228496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:09.228714 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.228519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:09.228714 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.228634 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:09.228714 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.228649 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:09.228714
ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.228658 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:09.228985 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.228714 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.228695196 +0000 UTC m=+3.167600442 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:09.229215 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.229194 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:09.229290 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.229255 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.229237599 +0000 UTC m=+3.168142840 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:09.229358 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.229312 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:09.229358 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.229345 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.22933364 +0000 UTC m=+3.168238882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:09.418435 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.418404 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:09.550383 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.550291 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:08 +0000 UTC" deadline="2028-01-08 23:25:54.839831786 +0000 UTC"
Apr 17 17:25:09.550383 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.550332 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15150h0m45.289508686s"
Apr 17 17:25:09.653940 ip-10-0-135-105 kubenswrapper[2571]: I0417
17:25:09.653906 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:09.654142 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.654058 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:09.654540 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.654515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:09.654658 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:09.654636 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:09.680640 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.680548 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerStarted","Data":"9c65eb0aa54e5698e8ec49f868a63de843dbd6a3a9c00b52f258490dc29ad180"}
Apr 17 17:25:09.691578 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.691538 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzc4l" event={"ID":"fd529c67-6434-4e26-bfa8-0edc94a3b098","Type":"ContainerStarted","Data":"bef8674addedfdbaf16e49db8c27ac336e5c08e4855e694a9fd27a9b47d2d83d"}
Apr 17 17:25:09.702375 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.702338 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxcr5" event={"ID":"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b","Type":"ContainerStarted","Data":"0f8623f2d9a5ca4034fa377e21a943f96e9e00671baf2c4a71eb538ae7c74f55"}
Apr 17 17:25:09.716321 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.716285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"ef64983b07f7b5b0c727745b621c0c74c08569f8cb22744146a62e5964ab2c42"}
Apr 17 17:25:09.723094 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.723057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7cnl" event={"ID":"1f443eb6-ad37-49de-a7de-e96834d08311","Type":"ContainerStarted","Data":"e8708e69589adb0a3e92bc554383cb683c0dfc96bfcacd64ffbdec799e84ad36"}
Apr 17 17:25:09.745337 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.745290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal" event={"ID":"c56e93d9fa5e41391e4558a0134622a0","Type":"ContainerStarted","Data":"a1f3bed5f783bb6e97065d183f3c2d7c8896bbc5e97b32324c146d363f58fcda"}
Apr 17 17:25:09.763300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.763259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" event={"ID":"18c0221a-27a0-4a22-a6bf-befe72dd72c0","Type":"ContainerStarted","Data":"2d092d51f1e47b4d750423bd4a40492ead1f39a4e95722086dd8858082d7b6e1"}
Apr 17 17:25:09.781820 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.781742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j67sl" event={"ID":"38a8697c-ea92-4488-90c5-13d587599dba","Type":"ContainerStarted","Data":"cdf0a0d20c5128ca8137f2b8284251c9ff0c1ca372f14c0b3c2ddb7cc73165ef"}
Apr 17 17:25:09.795594 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.795554 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2l9v7" event={"ID":"ba50df31-c8c4-481e-8f5a-53f3fc99f52c","Type":"ContainerStarted","Data":"7d715d6af513325e1d9447756bd5eaf36564448d264b41febed7c8bfb6fcfc78"}
Apr 17 17:25:09.823556 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.823338 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kx7gk" event={"ID":"54be9ce9-8e34-41ad-a9f8-f058d0ba0624","Type":"ContainerStarted","Data":"f7b2e94a7e947bfd27a86ca6d25a40e6b1e930736de47c68615def74af14eb0e"}
Apr 17 17:25:09.830562 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.830524 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal" event={"ID":"2d0b942ad205f5fb58ca3113d3bd1bee","Type":"ContainerStarted","Data":"c355953fb571ad01d9431ef59daec2d7ae422b91c5629ca753ecb92a4e239189"}
Apr 17 17:25:09.948066
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:09.947828 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:10.011363 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.011019 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:10.238381 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.238263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:10.238381 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.238326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:10.238381 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.238359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:10.238695 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.238511 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:10.238695 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.238530 2571 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:10.238695 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.238543 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:10.238695 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.238602 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:12.238584626 +0000 UTC m=+5.177489885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:10.239063 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.239043 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.239146 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.239108 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:12.23909219 +0000 UTC m=+5.177997436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.239221 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.239172 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:10.239221 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.239207 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:12.239196044 +0000 UTC m=+5.178101291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:10.551569 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.551472 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:08 +0000 UTC" deadline="2028-01-11 22:29:52.696659922 +0000 UTC" Apr 17 17:25:10.551569 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.551516 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15221h4m42.145148238s" Apr 17 17:25:10.654006 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:10.653954 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:10.654176 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:10.654114 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:11.656195 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:11.655416 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:11.656195 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:11.655550 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:11.656195 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:11.655984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:11.656195 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:11.656115 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:12.257365 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:12.257328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:12.257524 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:12.257388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:12.257524 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:12.257420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:12.257606 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.257568 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:12.257606 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.257585 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:12.257606 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.257597 2571 projected.go:194] Error 
preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:12.257694 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.257655 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:16.257635195 +0000 UTC m=+9.196540439 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:12.258102 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.258082 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:12.258196 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.258138 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:16.258122684 +0000 UTC m=+9.197027929 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:12.258196 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.258187 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:12.258307 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.258218 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:16.258208484 +0000 UTC m=+9.197113728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:12.654192 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:12.653612 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:12.654192 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:12.653752 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:13.653830 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:13.653795 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:13.654348 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:13.653947 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:13.656832 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:13.656651 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:13.656832 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:13.656775 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:14.653008 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:14.652961 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:14.653172 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:14.653110 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:15.654740 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:15.654158 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:15.654740 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:15.654185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:15.654740 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:15.654311 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:15.654740 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:15.654433 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:16.291046 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:16.291009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:16.291231 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:16.291068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:16.291231 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291171 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:16.291231 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291175 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:16.291231 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291227 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:24.291208556 +0000 UTC m=+17.230113812 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:16.291438 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:16.291249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:16.291438 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291320 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:24.291307213 +0000 UTC m=+17.230212457 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:16.291438 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291378 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:16.291438 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291403 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:16.291566 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291448 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:16.291566 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.291495 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:24.291479998 +0000 UTC m=+17.230385253 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:16.653822 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:16.653118 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:16.653822 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:16.653465 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:17.654778 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:17.654687 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:17.655235 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:17.654791 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:17.655300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:17.655243 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:17.655379 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:17.655358 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:18.653872 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:18.653837 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:18.654074 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:18.653969 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:19.653637 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:19.653602 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:19.654068 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:19.653609 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:19.654068 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:19.653731 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:19.654068 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:19.653802 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:20.653669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:20.653628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:20.654080 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:20.653746 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:21.653784 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:21.653748 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:21.653784 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:21.653787 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:21.654194 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:21.653865 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:21.654194 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:21.653978 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:22.653608 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:22.653564 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:22.653791 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:22.653683 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:23.653715 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:23.653634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:23.654164 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:23.653634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:23.654164 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:23.653756 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:23.654164 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:23.653864 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:24.351521 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:24.351471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:24.351521 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:24.351523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:24.351545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351635 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351653 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351672 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351686 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351712 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.351692238 +0000 UTC m=+33.290597482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351635 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:24.351781 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351752 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.351722487 +0000 UTC m=+33.290627747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:24.352125 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.351793 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.351774353 +0000 UTC m=+33.290679610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:24.653367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:24.653282 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:24.653509 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:24.653397 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:25.653958 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:25.653923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:25.654475 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:25.654062 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:25.654475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:25.654125 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:25.654475 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:25.654251 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:26.652914 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:26.652883 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:26.653103 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:26.652981 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:27.654452 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.654418 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:27.655201 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:27.654542 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:27.655201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.654595 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:27.655201 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:27.654742 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:27.879622 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.879593 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j67sl" event={"ID":"38a8697c-ea92-4488-90c5-13d587599dba","Type":"ContainerStarted","Data":"0daa8da1ca0266d228f999aacaa1efa5d6a9b46b7cb905e04adce6636dda8f79"}
Apr 17 17:25:27.881814 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.881788 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kx7gk" event={"ID":"54be9ce9-8e34-41ad-a9f8-f058d0ba0624","Type":"ContainerStarted","Data":"c6b908bce7d328adede88c76aa5bf1a85ca53732359bb523bc5de1c1db4558d4"}
Apr 17 17:25:27.884347 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.883496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal" event={"ID":"2d0b942ad205f5fb58ca3113d3bd1bee","Type":"ContainerStarted","Data":"e0e7fca71ab5309b3581fdcd46b00f66d790ce81ba602cae1c3fe1f6bfa79083"}
Apr 17 17:25:27.887864 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.887843 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:25:27.888240 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888215 2571 generic.go:358] "Generic (PLEG): container finished" podID="354f25fd-1072-4dea-8ce2-1953b417053a" containerID="37aebdf30b8b67c06b9426ae9847045bf6a9d65534a023ac0ad0d34803e0b4a9" exitCode=1
Apr 17 17:25:27.888306 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"8fd779215d1d7e897a8c6dde236590eb2c31e181a459828fb3c44e3ad8fd868d"}
Apr 17 17:25:27.888306 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888283 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"3cacadfab026301bff56453c3b41a2c360748971b8bf24b0da6ded25cd315ecb"}
Apr 17 17:25:27.888306 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"986abd138e64fd51c0e3120e8999392a6e87da047f0dda2ed5383034433a91f9"}
Apr 17 17:25:27.888432 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"1a11c42dbe0af8ca0366c04f375ede27fa63d60d38779e609942212c09ce8e87"}
Apr 17 17:25:27.888432 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerDied","Data":"37aebdf30b8b67c06b9426ae9847045bf6a9d65534a023ac0ad0d34803e0b4a9"}
Apr 17 17:25:27.888432 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.888339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"4bcd99a9fadccca138db7b9360916097a9d231e9088e1459a1455a0e7f8f5478"}
Apr 17 17:25:27.903700 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.903451 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j67sl" podStartSLOduration=2.876316589 podStartE2EDuration="20.903430117s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.895078043 +0000 UTC m=+1.833983284" lastFinishedPulling="2026-04-17 17:25:26.922191559 +0000 UTC m=+19.861096812" observedRunningTime="2026-04-17 17:25:27.899472559 +0000 UTC m=+20.838377822" watchObservedRunningTime="2026-04-17 17:25:27.903430117 +0000 UTC m=+20.842335380"
Apr 17 17:25:27.933546 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:27.932713 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kx7gk" podStartSLOduration=2.5466719209999997 podStartE2EDuration="20.932697138s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.846343657 +0000 UTC m=+1.785248898" lastFinishedPulling="2026-04-17 17:25:27.232368861 +0000 UTC m=+20.171274115" observedRunningTime="2026-04-17 17:25:27.9181548 +0000 UTC m=+20.857060064" watchObservedRunningTime="2026-04-17 17:25:27.932697138 +0000 UTC m=+20.871602402"
Apr 17 17:25:28.653837 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.653738 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:28.654015 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:28.653868 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:28.860127 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.860104 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:25:28.891896 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.891855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" event={"ID":"18c0221a-27a0-4a22-a6bf-befe72dd72c0","Type":"ContainerStarted","Data":"fd152d0def17b8bae287acb2a4939e11bae8f055f165eeddea330b6ffc371bd3"}
Apr 17 17:25:28.891896 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.891887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" event={"ID":"18c0221a-27a0-4a22-a6bf-befe72dd72c0","Type":"ContainerStarted","Data":"eb14c39e1f929e0458eb42043024969c5dd4e32e628c4f3727274c66a0c5a6d5"}
Apr 17 17:25:28.893063 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.893034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2l9v7" event={"ID":"ba50df31-c8c4-481e-8f5a-53f3fc99f52c","Type":"ContainerStarted","Data":"a331696bf529bd06ec6187c09f819abb8ef8a9e19732c8d41ce4522a5c1d64c5"}
Apr 17 17:25:28.894259 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.894239 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="fe86552e43f8045d851c620bd7ac3b5b38cf7a70d60090cbe3c19fd3ebd2cae4" exitCode=0
Apr 17 17:25:28.894339 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.894297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"fe86552e43f8045d851c620bd7ac3b5b38cf7a70d60090cbe3c19fd3ebd2cae4"}
Apr 17 17:25:28.895639 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.895552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzc4l" event={"ID":"fd529c67-6434-4e26-bfa8-0edc94a3b098","Type":"ContainerStarted","Data":"9d9faea6f6bb75572c7ca2b43a47ad6b8906a88d7b8a7df4a5571c4284d0e975"}
Apr 17 17:25:28.896840 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.896817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxcr5" event={"ID":"e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b","Type":"ContainerStarted","Data":"224e01f997c02fab1fb607feb75a781733a03f06336385db9954fcda45b2400a"}
Apr 17 17:25:28.900043 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.900023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7cnl" event={"ID":"1f443eb6-ad37-49de-a7de-e96834d08311","Type":"ContainerStarted","Data":"d12f3efdc3f72983e2b73d87b0bd91437f7bf5ec044a67cd21188eb0fcda6614"}
Apr 17 17:25:28.901393 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.901371 2571 generic.go:358] "Generic (PLEG): container finished" podID="c56e93d9fa5e41391e4558a0134622a0" containerID="51577496f684f4c425a4feaefaab1ff053bcd7054bc6e0d6e9b24fcc0e23a0b1" exitCode=0
Apr 17 17:25:28.901502 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.901480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal" event={"ID":"c56e93d9fa5e41391e4558a0134622a0","Type":"ContainerDied","Data":"51577496f684f4c425a4feaefaab1ff053bcd7054bc6e0d6e9b24fcc0e23a0b1"}
Apr 17 17:25:28.908710 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.908674 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-105.ec2.internal" podStartSLOduration=21.908661772 podStartE2EDuration="21.908661772s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:27.933101322 +0000 UTC m=+20.872006586" watchObservedRunningTime="2026-04-17 17:25:28.908661772 +0000 UTC m=+21.847567061"
Apr 17 17:25:28.922899 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.922852 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2l9v7" podStartSLOduration=3.865524007 podStartE2EDuration="21.922836457s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.867889024 +0000 UTC m=+1.806794265" lastFinishedPulling="2026-04-17 17:25:26.925201459 +0000 UTC m=+19.864106715" observedRunningTime="2026-04-17 17:25:28.908134609 +0000 UTC m=+21.847039871" watchObservedRunningTime="2026-04-17 17:25:28.922836457 +0000 UTC m=+21.861741735"
Apr 17 17:25:28.923436 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.923399 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hzc4l" podStartSLOduration=3.931986728 podStartE2EDuration="21.923389767s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.92853093 +0000 UTC m=+1.867436170" lastFinishedPulling="2026-04-17 17:25:26.919933967 +0000 UTC m=+19.858839209" observedRunningTime="2026-04-17 17:25:28.92268915 +0000 UTC m=+21.861594413" watchObservedRunningTime="2026-04-17 17:25:28.923389767 +0000 UTC m=+21.862295031"
Apr 17 17:25:28.936615 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:28.936574 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hxcr5" podStartSLOduration=3.958454646 podStartE2EDuration="21.936556819s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.913106521 +0000 UTC m=+1.852011762" lastFinishedPulling="2026-04-17 17:25:26.891208678 +0000 UTC m=+19.830113935" observedRunningTime="2026-04-17 17:25:28.936111641 +0000 UTC m=+21.875016905" watchObservedRunningTime="2026-04-17 17:25:28.936556819 +0000 UTC m=+21.875462084"
Apr 17 17:25:29.002958 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.002919 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w7cnl" podStartSLOduration=3.912793896 podStartE2EDuration="22.002906659s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.828822112 +0000 UTC m=+1.767727356" lastFinishedPulling="2026-04-17 17:25:26.918934866 +0000 UTC m=+19.857840119" observedRunningTime="2026-04-17 17:25:29.002546666 +0000 UTC m=+21.941451932" watchObservedRunningTime="2026-04-17 17:25:29.002906659 +0000 UTC m=+21.941811922"
Apr 17 17:25:29.590761 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.590643 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:25:28.860124411Z","UUID":"247cea8d-ef92-4373-a486-43565d789ef6","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:25:29.595811 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.595036 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 17:25:29.595811 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.595070 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 17:25:29.653931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.653893 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:29.654128 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.653939 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:29.654128 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:29.654046 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:29.654249 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:29.654134 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:29.907301 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.907279 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:25:29.907657 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.907635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"5dedfaae88be43c3f947f5c9d55ff591dbdf4633e17d2656f80c3edad5fa304e"}
Apr 17 17:25:29.909915 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:29.909882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal" event={"ID":"c56e93d9fa5e41391e4558a0134622a0","Type":"ContainerStarted","Data":"49b19943ebcd0a7cee070e173b68aebf7b339fc36dab37e90c4b334d9fbf29c5"}
Apr 17 17:25:30.653415 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:30.653390 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:30.653620 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:30.653505 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:30.912818 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:30.912738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" event={"ID":"18c0221a-27a0-4a22-a6bf-befe72dd72c0","Type":"ContainerStarted","Data":"8bd6ec303f341a7acd1b1d6e587f51a727d95d6876b139c352943aa688103d42"}
Apr 17 17:25:30.933640 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:30.933589 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-105.ec2.internal" podStartSLOduration=23.933570808 podStartE2EDuration="23.933570808s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:29.937121265 +0000 UTC m=+22.876026540" watchObservedRunningTime="2026-04-17 17:25:30.933570808 +0000 UTC m=+23.872476071"
Apr 17 17:25:30.934431 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:30.934391 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvzv7" podStartSLOduration=3.086883914 podStartE2EDuration="23.934382341s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.951824529 +0000 UTC m=+1.890729771" lastFinishedPulling="2026-04-17 17:25:29.799322945 +0000 UTC m=+22.738228198" observedRunningTime="2026-04-17 17:25:30.932750704 +0000 UTC m=+23.871655970" watchObservedRunningTime="2026-04-17 17:25:30.934382341 +0000 UTC m=+23.873287605"
Apr 17 17:25:31.653875 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:31.653688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:31.654079 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:31.653762 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:31.654079 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:31.653963 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:31.654191 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:31.654098 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:32.056075 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.056039 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:32.056800 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.056778 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:32.653729 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.653693 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:32.653903 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:32.653809 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:32.919684 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.919616 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:25:32.920035 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.920001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"b4358d8db67546248869f03bf1661c5272028e578edcaeca1f7de4e1175c8ee8"}
Apr 17 17:25:32.920261 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.920244 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:32.920576 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.920562 2571 scope.go:117] "RemoveContainer" containerID="37aebdf30b8b67c06b9426ae9847045bf6a9d65534a023ac0ad0d34803e0b4a9"
Apr 17 17:25:32.920813 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:32.920795 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w7cnl"
Apr 17 17:25:33.654025 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.653833 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:33.654600 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.653863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:33.654600 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:33.654103 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:33.654600 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:33.654232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:33.923516 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.923438 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="084c04345dd42dccd3a99c5c8490899c647d5f450bb4749b22b060bd720c3484" exitCode=0
Apr 17 17:25:33.923652 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.923519 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"084c04345dd42dccd3a99c5c8490899c647d5f450bb4749b22b060bd720c3484"}
Apr 17 17:25:33.926875 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.926857 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:25:33.927242 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.927214 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" event={"ID":"354f25fd-1072-4dea-8ce2-1953b417053a","Type":"ContainerStarted","Data":"dfccded32c514b38558e403b7fb70e044aa39e3a07a55d30196ab0e24d1a075a"}
Apr 17 17:25:33.927592 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.927575 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:33.927665 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.927599 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:33.927665 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.927612 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:33.942230 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.942203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:33.942347 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.942266 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:25:33.973263 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:33.973200 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l" podStartSLOduration=8.833992174 podStartE2EDuration="26.973180845s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.872255114 +0000 UTC m=+1.811160355" lastFinishedPulling="2026-04-17 17:25:27.011443785 +0000 UTC m=+19.950349026" observedRunningTime="2026-04-17 17:25:33.972034494 +0000 UTC m=+26.910939757" watchObservedRunningTime="2026-04-17 17:25:33.973180845 +0000 UTC m=+26.912086109"
Apr 17 17:25:34.653535 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.653510 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:34.653638 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:34.653611 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:34.730406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.730372 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zbptt"]
Apr 17 17:25:34.730783 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.730547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:34.730783 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:34.730679 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:34.734376 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.734350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lzfd4"]
Apr 17 17:25:34.741802 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.741781 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lwq8d"]
Apr 17 17:25:34.741916 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.741903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:34.742041 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:34.741983 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb"
Apr 17 17:25:34.931022 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.930923 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="6cd1deda2988542248461649a5e528c941c5aa3b42603f87801306e175b1b9fb" exitCode=0
Apr 17 17:25:34.931160 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.931056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"6cd1deda2988542248461649a5e528c941c5aa3b42603f87801306e175b1b9fb"}
Apr 17 17:25:34.931506 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:34.931486 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:25:34.931658 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:34.931632 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f"
Apr 17 17:25:35.935848 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:35.935758 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="5783e43b17e9abfba4b0b88a0624fed302d3721508df5352b5238b2a4159fccb" exitCode=0
Apr 17 17:25:35.935848 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:35.935811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"5783e43b17e9abfba4b0b88a0624fed302d3721508df5352b5238b2a4159fccb"}
Apr 17 17:25:36.653774 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:36.653543 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:25:36.653928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:36.653579 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:25:36.653928 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:36.653825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4"
Apr 17 17:25:36.653928 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:36.653584 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:36.653928 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:36.653909 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:36.654163 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:36.653983 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:38.653407 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.653375 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:38.653951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.653375 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:38.653951 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:38.653504 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lzfd4" podUID="78665d2b-9809-4082-abf9-62de202c8f2f" Apr 17 17:25:38.653951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.653386 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:38.653951 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:38.653577 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lwq8d" podUID="5a407c11-3cbe-4521-8abd-48c6506368fb" Apr 17 17:25:38.653951 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:38.653717 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:25:38.875731 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.875705 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-105.ec2.internal" event="NodeReady" Apr 17 17:25:38.875882 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.875856 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:25:38.935028 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.934927 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-frpbr"] Apr 17 17:25:38.983458 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.983425 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r7dzq"] Apr 17 17:25:38.983646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.983605 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:38.986549 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.986526 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:25:38.986914 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.986896 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\"" Apr 17 17:25:38.987026 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.986969 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:25:38.998637 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.998616 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-frpbr"] Apr 17 17:25:38.998637 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.998639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r7dzq"] 
Apr 17 17:25:38.998773 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:38.998745 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.001821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.001801 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:25:39.002305 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.002280 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\"" Apr 17 17:25:39.002398 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.002284 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:25:39.003171 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.003152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:25:39.063613 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.063572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.063613 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.063617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-tmp-dir\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.063835 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.063655 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42jr\" (UniqueName: \"kubernetes.io/projected/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-kube-api-access-p42jr\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.063835 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.063749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-config-volume\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.164466 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.164430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-config-volume\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.164466 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.164473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-598td\" (UniqueName: \"kubernetes.io/projected/32351f44-6fa2-44e0-81dd-549f2f19d705-kube-api-access-598td\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.164689 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.164511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.164689 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:25:39.164568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.164689 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.164613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-tmp-dir\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.164689 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.164675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p42jr\" (UniqueName: \"kubernetes.io/projected/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-kube-api-access-p42jr\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.164870 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.164727 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:39.164870 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.164815 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:39.664789943 +0000 UTC m=+32.603695198 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found Apr 17 17:25:39.165059 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.165034 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-tmp-dir\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.165174 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.165154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-config-volume\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.178709 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.178679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42jr\" (UniqueName: \"kubernetes.io/projected/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-kube-api-access-p42jr\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.265730 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.265691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-598td\" (UniqueName: \"kubernetes.io/projected/32351f44-6fa2-44e0-81dd-549f2f19d705-kube-api-access-598td\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.265891 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.265759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.265969 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.265905 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:39.265969 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.265964 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:39.765944883 +0000 UTC m=+32.704850144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found Apr 17 17:25:39.286149 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.286125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-598td\" (UniqueName: \"kubernetes.io/projected/32351f44-6fa2-44e0-81dd-549f2f19d705-kube-api-access-598td\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.668003 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.667912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:39.668606 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.668081 2571 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:39.668606 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.668148 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.668132774 +0000 UTC m=+33.607038019 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found Apr 17 17:25:39.769182 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:39.769144 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:39.769337 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.769315 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:39.769409 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:39.769397 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.769372452 +0000 UTC m=+33.708277707 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found Apr 17 17:25:40.374591 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.374548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:40.374774 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.374616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:40.374774 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.374656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:40.374774 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374724 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374798 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 
17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374804 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret podName:78665d2b-9809-4082-abf9-62de202c8f2f nodeName:}" failed. No retries permitted until 2026-04-17 17:26:12.374788658 +0000 UTC m=+65.313693902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret") pod "global-pull-secret-syncer-lzfd4" (UID: "78665d2b-9809-4082-abf9-62de202c8f2f") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374813 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374724 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374826 2571 projected.go:194] Error preparing data for projected volume kube-api-access-t84fc for pod openshift-network-diagnostics/network-check-target-lwq8d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374866 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:12.374853741 +0000 UTC m=+65.313758982 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:40.374895 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.374888 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc podName:5a407c11-3cbe-4521-8abd-48c6506368fb nodeName:}" failed. No retries permitted until 2026-04-17 17:26:12.374872235 +0000 UTC m=+65.313777493 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-t84fc" (UniqueName: "kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc") pod "network-check-target-lwq8d" (UID: "5a407c11-3cbe-4521-8abd-48c6506368fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:40.654036 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.653935 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:25:40.654217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.654097 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d" Apr 17 17:25:40.654217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.654134 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657796 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\"" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657859 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\"" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657859 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:40.658346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.657859 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:40.676851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.676829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:40.677365 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.676945 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:40.677365 ip-10-0-135-105 
kubenswrapper[2571]: E0417 17:25:40.677024 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:42.676977476 +0000 UTC m=+35.615882716 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found Apr 17 17:25:40.778198 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:40.778167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:25:40.778382 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.778316 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:40.778382 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:40.778376 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:42.778355591 +0000 UTC m=+35.717260834 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found Apr 17 17:25:42.694422 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:42.694383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:25:42.694853 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:42.694526 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:42.694853 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:42.694581 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:46.694566704 +0000 UTC m=+39.633471946 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:42.795307 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:42.795270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:25:42.795458 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:42.795382 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:42.795458 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:42.795432 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:46.795419439 +0000 UTC m=+39.734324685 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found
Apr 17 17:25:42.952655 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:42.952374 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="e3537e926ff39ee82541cfffce96bfcd8b574e113f84e4f3733f9f21112175aa" exitCode=0
Apr 17 17:25:42.952655 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:42.952453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"e3537e926ff39ee82541cfffce96bfcd8b574e113f84e4f3733f9f21112175aa"}
Apr 17 17:25:43.956486 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:43.956447 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcf417a0-db1f-4d94-9f13-bd789e955760" containerID="c2ceab23af89aaa31ea5b2861fb25b5d42f8addd138519d7c1642f0e911a7993" exitCode=0
Apr 17 17:25:43.956826 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:43.956498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerDied","Data":"c2ceab23af89aaa31ea5b2861fb25b5d42f8addd138519d7c1642f0e911a7993"}
Apr 17 17:25:44.960896 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:44.960863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fv24" event={"ID":"dcf417a0-db1f-4d94-9f13-bd789e955760","Type":"ContainerStarted","Data":"77bcc2cb165bce9de42640008add9f3e126cf07b59f761bf42685daa4c805ee3"}
Apr 17 17:25:44.982709 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:44.982662 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5fv24" podStartSLOduration=5.075453591 podStartE2EDuration="37.982645868s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.947144325 +0000 UTC m=+1.886049565" lastFinishedPulling="2026-04-17 17:25:41.854336597 +0000 UTC m=+34.793241842" observedRunningTime="2026-04-17 17:25:44.981208388 +0000 UTC m=+37.920113654" watchObservedRunningTime="2026-04-17 17:25:44.982645868 +0000 UTC m=+37.921551131"
Apr 17 17:25:46.725088 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:46.725046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr"
Apr 17 17:25:46.725504 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:46.725163 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:46.725504 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:46.725223 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.725210393 +0000 UTC m=+47.664115634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:46.825746 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:46.825712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:25:46.825923 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:46.825895 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:46.826018 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:46.826007 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.825963191 +0000 UTC m=+47.764868448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found
Apr 17 17:25:54.784179 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:54.784134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr"
Apr 17 17:25:54.784588 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:54.784260 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:54.784588 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:54.784312 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:10.784298909 +0000 UTC m=+63.723204151 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:54.884584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:25:54.884553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:25:54.884705 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:54.884691 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:54.884760 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:25:54.884751 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:10.884734536 +0000 UTC m=+63.823639798 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found
Apr 17 17:26:05.950112 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:05.950075 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhm5l"
Apr 17 17:26:10.793882 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:10.793825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr"
Apr 17 17:26:10.794298 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:10.793976 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:26:10.794298 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:10.794072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:42.79405344 +0000 UTC m=+95.732958682 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found
Apr 17 17:26:10.894169 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:10.894124 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:26:10.894338 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:10.894268 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:26:10.894338 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:10.894328 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:42.894313303 +0000 UTC m=+95.833218544 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found
Apr 17 17:26:12.405175 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.405125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:26:12.405175 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.405176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:26:12.405680 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.405231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:26:12.412442 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.412417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 17:26:12.412503 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.412427 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:26:12.412503 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.412456 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:26:12.415635 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:12.415619 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:26:12.415698 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:12.415688 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:16.415667272 +0000 UTC m=+129.354572513 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : secret "metrics-daemon-secret" not found
Apr 17 17:26:12.418763 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.418732 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78665d2b-9809-4082-abf9-62de202c8f2f-original-pull-secret\") pod \"global-pull-secret-syncer-lzfd4\" (UID: \"78665d2b-9809-4082-abf9-62de202c8f2f\") " pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:26:12.419785 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.419772 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:26:12.428146 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.428126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t84fc\" (UniqueName: \"kubernetes.io/projected/5a407c11-3cbe-4521-8abd-48c6506368fb-kube-api-access-t84fc\") pod \"network-check-target-lwq8d\" (UID: \"5a407c11-3cbe-4521-8abd-48c6506368fb\") " pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:26:12.476867 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.476837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\""
Apr 17 17:26:12.485081 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.485056 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:26:12.487851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.487827 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lzfd4"
Apr 17 17:26:12.679971 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.679895 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lwq8d"]
Apr 17 17:26:12.682510 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:12.682487 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lzfd4"]
Apr 17 17:26:12.683067 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:26:12.683041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a407c11_3cbe_4521_8abd_48c6506368fb.slice/crio-8a393f30df741e2b609fc294693f0fdfe936185cf6892b6eb4a94c5b24f7e969 WatchSource:0}: Error finding container 8a393f30df741e2b609fc294693f0fdfe936185cf6892b6eb4a94c5b24f7e969: Status 404 returned error can't find the container with id 8a393f30df741e2b609fc294693f0fdfe936185cf6892b6eb4a94c5b24f7e969
Apr 17 17:26:12.685584 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:26:12.685550 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78665d2b_9809_4082_abf9_62de202c8f2f.slice/crio-94d78e176a42bef2a3434b844845b20d6a2cffa8801f448a7f9cfa3ba6c205d2 WatchSource:0}: Error finding container 94d78e176a42bef2a3434b844845b20d6a2cffa8801f448a7f9cfa3ba6c205d2: Status 404 returned error can't find the container with id 94d78e176a42bef2a3434b844845b20d6a2cffa8801f448a7f9cfa3ba6c205d2
Apr 17 17:26:13.013285 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:13.013251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lwq8d" event={"ID":"5a407c11-3cbe-4521-8abd-48c6506368fb","Type":"ContainerStarted","Data":"8a393f30df741e2b609fc294693f0fdfe936185cf6892b6eb4a94c5b24f7e969"}
Apr 17 17:26:13.014474 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:13.014432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lzfd4" event={"ID":"78665d2b-9809-4082-abf9-62de202c8f2f","Type":"ContainerStarted","Data":"94d78e176a42bef2a3434b844845b20d6a2cffa8801f448a7f9cfa3ba6c205d2"}
Apr 17 17:26:18.025865 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:18.025825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lzfd4" event={"ID":"78665d2b-9809-4082-abf9-62de202c8f2f","Type":"ContainerStarted","Data":"c9ba76dd703c7b081ac93e575cd1bf44f39312ac36d78d7f9d705d81b6e043a7"}
Apr 17 17:26:18.027228 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:18.027201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lwq8d" event={"ID":"5a407c11-3cbe-4521-8abd-48c6506368fb","Type":"ContainerStarted","Data":"7b4b627abe750e41b67c7eefcf341abc5057b5e085d3da30015a3a8206cd2922"}
Apr 17 17:26:18.027372 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:18.027356 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:26:18.042265 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:18.042229 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lzfd4" podStartSLOduration=65.689481621 podStartE2EDuration="1m10.042218595s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:26:12.68701605 +0000 UTC m=+65.625921291" lastFinishedPulling="2026-04-17 17:26:17.039753011 +0000 UTC m=+69.978658265" observedRunningTime="2026-04-17 17:26:18.041418091 +0000 UTC m=+70.980323356" watchObservedRunningTime="2026-04-17 17:26:18.042218595 +0000 UTC m=+70.981123912"
Apr 17 17:26:18.057115 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:18.057076 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lwq8d" podStartSLOduration=66.70547953 podStartE2EDuration="1m11.057064652s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:26:12.684868937 +0000 UTC m=+65.623774178" lastFinishedPulling="2026-04-17 17:26:17.036454059 +0000 UTC m=+69.975359300" observedRunningTime="2026-04-17 17:26:18.056666711 +0000 UTC m=+70.995571979" watchObservedRunningTime="2026-04-17 17:26:18.057064652 +0000 UTC m=+70.995969914"
Apr 17 17:26:42.817463 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:42.817418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr"
Apr 17 17:26:42.817933 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:42.817538 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:26:42.817933 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:42.817594 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls podName:4efcb7d2-ede6-4e86-920c-19ff33a94f7b nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.817579262 +0000 UTC m=+159.756484503 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls") pod "dns-default-frpbr" (UID: "4efcb7d2-ede6-4e86-920c-19ff33a94f7b") : secret "dns-default-metrics-tls" not found
Apr 17 17:26:42.917718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:42.917671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:26:42.917869 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:42.917843 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:26:42.917943 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:26:42.917933 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert podName:32351f44-6fa2-44e0-81dd-549f2f19d705 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.917914873 +0000 UTC m=+159.856820114 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert") pod "ingress-canary-r7dzq" (UID: "32351f44-6fa2-44e0-81dd-549f2f19d705") : secret "canary-serving-cert" not found
Apr 17 17:26:49.031262 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:26:49.031225 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lwq8d"
Apr 17 17:27:01.189649 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.189616 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"]
Apr 17 17:27:01.192461 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.192442 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"
Apr 17 17:27:01.194783 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.194761 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 17:27:01.195591 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.195565 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:27:01.195591 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.195585 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-htvf9\""
Apr 17 17:27:01.201044 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.201025 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"]
Apr 17 17:27:01.345181 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.345144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6m7\" (UniqueName: \"kubernetes.io/projected/390c586c-bb4c-4700-b4eb-ccd63ad31506-kube-api-access-xt6m7\") pod \"volume-data-source-validator-7c6cbb6c87-mvh8f\" (UID: \"390c586c-bb4c-4700-b4eb-ccd63ad31506\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"
Apr 17 17:27:01.445783 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.445688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6m7\" (UniqueName: \"kubernetes.io/projected/390c586c-bb4c-4700-b4eb-ccd63ad31506-kube-api-access-xt6m7\") pod \"volume-data-source-validator-7c6cbb6c87-mvh8f\" (UID: \"390c586c-bb4c-4700-b4eb-ccd63ad31506\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"
Apr 17 17:27:01.454406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.454381 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6m7\" (UniqueName: \"kubernetes.io/projected/390c586c-bb4c-4700-b4eb-ccd63ad31506-kube-api-access-xt6m7\") pod \"volume-data-source-validator-7c6cbb6c87-mvh8f\" (UID: \"390c586c-bb4c-4700-b4eb-ccd63ad31506\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"
Apr 17 17:27:01.500750 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.500709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"
Apr 17 17:27:01.609982 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:01.609951 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f"]
Apr 17 17:27:01.614346 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:01.614319 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390c586c_bb4c_4700_b4eb_ccd63ad31506.slice/crio-edd81f8caccfad427694a9b7678f9979c2ec95184f3064ea75204583a92ba5fc WatchSource:0}: Error finding container edd81f8caccfad427694a9b7678f9979c2ec95184f3064ea75204583a92ba5fc: Status 404 returned error can't find the container with id edd81f8caccfad427694a9b7678f9979c2ec95184f3064ea75204583a92ba5fc
Apr 17 17:27:02.108417 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.108381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f" event={"ID":"390c586c-bb4c-4700-b4eb-ccd63ad31506","Type":"ContainerStarted","Data":"edd81f8caccfad427694a9b7678f9979c2ec95184f3064ea75204583a92ba5fc"}
Apr 17 17:27:02.992944 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.992919 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"]
Apr 17 17:27:02.995778 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.995760 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:02.998223 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.998202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:27:02.998329 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.998248 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 17:27:02.998329 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.998295 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 17:27:02.998329 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:02.998248 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gprl4\""
Apr 17 17:27:03.005438 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.005348 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"]
Apr 17 17:27:03.058725 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.058694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hpr\" (UniqueName: \"kubernetes.io/projected/ec479a05-737c-41d2-aa0d-9d627b239283-kube-api-access-22hpr\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.058725 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.058737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.111920 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.111883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f" event={"ID":"390c586c-bb4c-4700-b4eb-ccd63ad31506","Type":"ContainerStarted","Data":"3b9d995cd7d8b39e33a1b16951b5967902a067520eeef03818c9a66f09ed7b7d"}
Apr 17 17:27:03.127526 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.127478 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mvh8f" podStartSLOduration=0.807976378 podStartE2EDuration="2.127465159s" podCreationTimestamp="2026-04-17 17:27:01 +0000 UTC" firstStartedPulling="2026-04-17 17:27:01.616519581 +0000 UTC m=+114.555424823" lastFinishedPulling="2026-04-17 17:27:02.936008363 +0000 UTC m=+115.874913604" observedRunningTime="2026-04-17 17:27:03.126475827 +0000 UTC m=+116.065381091" watchObservedRunningTime="2026-04-17 17:27:03.127465159 +0000 UTC m=+116.066370422"
Apr 17 17:27:03.159873 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.159844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hpr\" (UniqueName: \"kubernetes.io/projected/ec479a05-737c-41d2-aa0d-9d627b239283-kube-api-access-22hpr\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.159981 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.159883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.160047 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:03.159983 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:27:03.160086 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:03.160058 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:03.660042048 +0000 UTC m=+116.598947290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found
Apr 17 17:27:03.168608 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.168579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hpr\" (UniqueName: \"kubernetes.io/projected/ec479a05-737c-41d2-aa0d-9d627b239283-kube-api-access-22hpr\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.664817 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.664787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:03.665007 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:03.664922 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:27:03.665007 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:03.664979 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:04.66496557 +0000 UTC m=+117.603870814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found
Apr 17 17:27:03.994914 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.994835 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nq8bf"]
Apr 17 17:27:03.997724 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:03.997709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.000200 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.000149 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xppxh\""
Apr 17 17:27:04.000343 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.000315 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 17:27:04.001100 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.001081 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 17:27:04.001396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.001380 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 17:27:04.001478 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.001443 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:27:04.005845 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.005828 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 17:27:04.011473 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.008542 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nq8bf"]
Apr 17 17:27:04.067361 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.067317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hkw\" (UniqueName: \"kubernetes.io/projected/9dd87d69-6559-458b-94cc-fc8a74f27e9a-kube-api-access-52hkw\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.067361 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.067363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-trusted-ca\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.067585 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.067470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9dd87d69-6559-458b-94cc-fc8a74f27e9a-serving-cert\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.067585 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.067512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-config\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.168535 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.168493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52hkw\" (UniqueName: \"kubernetes.io/projected/9dd87d69-6559-458b-94cc-fc8a74f27e9a-kube-api-access-52hkw\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.168720 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.168545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-trusted-ca\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.168720 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.168596 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9dd87d69-6559-458b-94cc-fc8a74f27e9a-serving-cert\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.168720 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.168627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-config\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.169401 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.169374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-config\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:04.169594 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.169574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd87d69-6559-458b-94cc-fc8a74f27e9a-trusted-ca\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 
17:27:04.171023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.171005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9dd87d69-6559-458b-94cc-fc8a74f27e9a-serving-cert\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" Apr 17 17:27:04.176420 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.176400 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hkw\" (UniqueName: \"kubernetes.io/projected/9dd87d69-6559-458b-94cc-fc8a74f27e9a-kube-api-access-52hkw\") pod \"console-operator-9d4b6777b-nq8bf\" (UID: \"9dd87d69-6559-458b-94cc-fc8a74f27e9a\") " pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" Apr 17 17:27:04.306435 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.306393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" Apr 17 17:27:04.420064 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.420034 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nq8bf"] Apr 17 17:27:04.422838 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:04.422799 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd87d69_6559_458b_94cc_fc8a74f27e9a.slice/crio-7e6588a492078a326e8cfc0c3d9a55b3bc0919e2c53a389daec075347ddfc064 WatchSource:0}: Error finding container 7e6588a492078a326e8cfc0c3d9a55b3bc0919e2c53a389daec075347ddfc064: Status 404 returned error can't find the container with id 7e6588a492078a326e8cfc0c3d9a55b3bc0919e2c53a389daec075347ddfc064 Apr 17 17:27:04.672079 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:04.671979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" Apr 17 17:27:04.672224 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:04.672126 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:27:04.672224 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:04.672201 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:06.672184553 +0000 UTC m=+119.611089815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found Apr 17 17:27:05.117510 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:05.117468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" event={"ID":"9dd87d69-6559-458b-94cc-fc8a74f27e9a","Type":"ContainerStarted","Data":"7e6588a492078a326e8cfc0c3d9a55b3bc0919e2c53a389daec075347ddfc064"} Apr 17 17:27:06.688223 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:06.688135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" Apr 17 17:27:06.688616 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:06.688243 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:27:06.688616 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:06.688297 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:10.68828327 +0000 UTC m=+123.627188512 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found Apr 17 17:27:06.692588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:06.692569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hzc4l_fd529c67-6434-4e26-bfa8-0edc94a3b098/dns-node-resolver/0.log" Apr 17 17:27:07.122446 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:07.122418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/0.log" Apr 17 17:27:07.122629 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:07.122457 2571 generic.go:358] "Generic (PLEG): container finished" podID="9dd87d69-6559-458b-94cc-fc8a74f27e9a" containerID="6e9f0c73177275149b764df76dae353c36f6855cd5424e4b4c53c22ae807b5b1" exitCode=255 Apr 17 17:27:07.122629 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:07.122512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" 
event={"ID":"9dd87d69-6559-458b-94cc-fc8a74f27e9a","Type":"ContainerDied","Data":"6e9f0c73177275149b764df76dae353c36f6855cd5424e4b4c53c22ae807b5b1"} Apr 17 17:27:07.122768 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:07.122751 2571 scope.go:117] "RemoveContainer" containerID="6e9f0c73177275149b764df76dae353c36f6855cd5424e4b4c53c22ae807b5b1" Apr 17 17:27:07.676671 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:07.676646 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hxcr5_e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b/node-ca/0.log" Apr 17 17:27:08.126690 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:08.126662 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log" Apr 17 17:27:08.127111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:08.127019 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/0.log" Apr 17 17:27:08.127111 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:08.127051 2571 generic.go:358] "Generic (PLEG): container finished" podID="9dd87d69-6559-458b-94cc-fc8a74f27e9a" containerID="e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37" exitCode=255 Apr 17 17:27:08.127184 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:08.127112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" event={"ID":"9dd87d69-6559-458b-94cc-fc8a74f27e9a","Type":"ContainerDied","Data":"e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37"} Apr 17 17:27:08.127184 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:08.127154 2571 scope.go:117] "RemoveContainer" containerID="6e9f0c73177275149b764df76dae353c36f6855cd5424e4b4c53c22ae807b5b1" Apr 17 17:27:08.127419 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:27:08.127399 2571 scope.go:117] "RemoveContainer" containerID="e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37" Apr 17 17:27:08.127618 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:08.127592 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nq8bf_openshift-console-operator(9dd87d69-6559-458b-94cc-fc8a74f27e9a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" podUID="9dd87d69-6559-458b-94cc-fc8a74f27e9a" Apr 17 17:27:09.130649 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.130619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log" Apr 17 17:27:09.131039 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.130937 2571 scope.go:117] "RemoveContainer" containerID="e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37" Apr 17 17:27:09.131136 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:09.131116 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nq8bf_openshift-console-operator(9dd87d69-6559-458b-94cc-fc8a74f27e9a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" podUID="9dd87d69-6559-458b-94cc-fc8a74f27e9a" Apr 17 17:27:09.414136 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.414061 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77"] Apr 17 17:27:09.417113 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.417097 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" Apr 17 17:27:09.419471 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.419452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-zmrzj\"" Apr 17 17:27:09.426542 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.426521 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77"] Apr 17 17:27:09.507959 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.507924 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8cx\" (UniqueName: \"kubernetes.io/projected/47718045-17b9-4963-8fe8-9bf81ac118cd-kube-api-access-5h8cx\") pod \"network-check-source-8894fc9bd-6kv77\" (UID: \"47718045-17b9-4963-8fe8-9bf81ac118cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" Apr 17 17:27:09.608519 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.608486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8cx\" (UniqueName: \"kubernetes.io/projected/47718045-17b9-4963-8fe8-9bf81ac118cd-kube-api-access-5h8cx\") pod \"network-check-source-8894fc9bd-6kv77\" (UID: \"47718045-17b9-4963-8fe8-9bf81ac118cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" Apr 17 17:27:09.616935 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.616915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8cx\" (UniqueName: \"kubernetes.io/projected/47718045-17b9-4963-8fe8-9bf81ac118cd-kube-api-access-5h8cx\") pod \"network-check-source-8894fc9bd-6kv77\" (UID: \"47718045-17b9-4963-8fe8-9bf81ac118cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" Apr 17 17:27:09.725627 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:27:09.725528 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" Apr 17 17:27:09.839943 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:09.839912 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77"] Apr 17 17:27:09.843217 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:09.843188 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47718045_17b9_4963_8fe8_9bf81ac118cd.slice/crio-02f48d79f43e3f4e7db0e50fa0192f095cba7b46337894c04d34ade214381344 WatchSource:0}: Error finding container 02f48d79f43e3f4e7db0e50fa0192f095cba7b46337894c04d34ade214381344: Status 404 returned error can't find the container with id 02f48d79f43e3f4e7db0e50fa0192f095cba7b46337894c04d34ade214381344 Apr 17 17:27:10.138669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:10.138626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" event={"ID":"47718045-17b9-4963-8fe8-9bf81ac118cd","Type":"ContainerStarted","Data":"a546a1b6204d19ee07d770fd37bf9f327e6464f41c6836953d8f46066698ead3"} Apr 17 17:27:10.138669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:10.138672 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" event={"ID":"47718045-17b9-4963-8fe8-9bf81ac118cd","Type":"ContainerStarted","Data":"02f48d79f43e3f4e7db0e50fa0192f095cba7b46337894c04d34ade214381344"} Apr 17 17:27:10.154005 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:10.153935 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6kv77" podStartSLOduration=1.153916092 podStartE2EDuration="1.153916092s" podCreationTimestamp="2026-04-17 17:27:09 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:10.153756552 +0000 UTC m=+123.092661816" watchObservedRunningTime="2026-04-17 17:27:10.153916092 +0000 UTC m=+123.092821352" Apr 17 17:27:10.718719 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:10.718677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" Apr 17 17:27:10.718898 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:10.718815 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:27:10.718898 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:10.718875 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:18.718861585 +0000 UTC m=+131.657766831 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found Apr 17 17:27:11.005373 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.005341 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd"] Apr 17 17:27:11.008298 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.008282 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.010934 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.010909 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 17:27:11.011066 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.010933 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 17:27:11.011066 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.010911 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:27:11.011066 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.010917 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fjftf\"" Apr 17 17:27:11.011722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.011703 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 17:27:11.018584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.018561 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd"] Apr 17 17:27:11.120457 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.120414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687643ca-4ad3-44fa-8f6b-ef388beace89-config\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.120669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.120510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/687643ca-4ad3-44fa-8f6b-ef388beace89-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.120669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.120534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm5xp\" (UniqueName: \"kubernetes.io/projected/687643ca-4ad3-44fa-8f6b-ef388beace89-kube-api-access-tm5xp\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.221586 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.221545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687643ca-4ad3-44fa-8f6b-ef388beace89-config\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.221943 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.221651 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/687643ca-4ad3-44fa-8f6b-ef388beace89-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.221943 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.221695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm5xp\" (UniqueName: \"kubernetes.io/projected/687643ca-4ad3-44fa-8f6b-ef388beace89-kube-api-access-tm5xp\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.222211 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.222183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687643ca-4ad3-44fa-8f6b-ef388beace89-config\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.223975 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.223955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/687643ca-4ad3-44fa-8f6b-ef388beace89-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.230827 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.230805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm5xp\" (UniqueName: \"kubernetes.io/projected/687643ca-4ad3-44fa-8f6b-ef388beace89-kube-api-access-tm5xp\") pod \"service-ca-operator-d6fc45fc5-jmwrd\" (UID: \"687643ca-4ad3-44fa-8f6b-ef388beace89\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.317115 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.317032 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" Apr 17 17:27:11.437249 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:11.437218 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd"] Apr 17 17:27:11.440395 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:11.440360 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687643ca_4ad3_44fa_8f6b_ef388beace89.slice/crio-942eca4117f24b57600137a5e4e5e8f2d9d892905d66fbe20607cba3918e6b44 WatchSource:0}: Error finding container 942eca4117f24b57600137a5e4e5e8f2d9d892905d66fbe20607cba3918e6b44: Status 404 returned error can't find the container with id 942eca4117f24b57600137a5e4e5e8f2d9d892905d66fbe20607cba3918e6b44 Apr 17 17:27:12.145510 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:12.145473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" event={"ID":"687643ca-4ad3-44fa-8f6b-ef388beace89","Type":"ContainerStarted","Data":"942eca4117f24b57600137a5e4e5e8f2d9d892905d66fbe20607cba3918e6b44"} Apr 17 17:27:13.293894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.293864 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv"] Apr 17 17:27:13.297902 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.297881 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" Apr 17 17:27:13.300411 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.300390 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:27:13.300493 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.300390 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:27:13.301234 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.301220 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5vw29\"" Apr 17 17:27:13.309724 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.309698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv"] Apr 17 17:27:13.437116 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.437084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqxfh\" (UniqueName: \"kubernetes.io/projected/5d8fa452-8b8f-48f3-90bf-a980adde658a-kube-api-access-xqxfh\") pod \"migrator-74bb7799d9-x4dpv\" (UID: \"5d8fa452-8b8f-48f3-90bf-a980adde658a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" Apr 17 17:27:13.537964 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.537752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqxfh\" (UniqueName: \"kubernetes.io/projected/5d8fa452-8b8f-48f3-90bf-a980adde658a-kube-api-access-xqxfh\") pod \"migrator-74bb7799d9-x4dpv\" (UID: \"5d8fa452-8b8f-48f3-90bf-a980adde658a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" Apr 17 17:27:13.547084 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:27:13.547052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqxfh\" (UniqueName: \"kubernetes.io/projected/5d8fa452-8b8f-48f3-90bf-a980adde658a-kube-api-access-xqxfh\") pod \"migrator-74bb7799d9-x4dpv\" (UID: \"5d8fa452-8b8f-48f3-90bf-a980adde658a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv"
Apr 17 17:27:13.627124 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.627079 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv"
Apr 17 17:27:13.743238 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:13.743210 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv"]
Apr 17 17:27:13.746070 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:13.746037 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8fa452_8b8f_48f3_90bf_a980adde658a.slice/crio-858b8d457652b14d91a37b9e61be90f4a3aaab3326e5d1e7bf06fbccb6e6147d WatchSource:0}: Error finding container 858b8d457652b14d91a37b9e61be90f4a3aaab3326e5d1e7bf06fbccb6e6147d: Status 404 returned error can't find the container with id 858b8d457652b14d91a37b9e61be90f4a3aaab3326e5d1e7bf06fbccb6e6147d
Apr 17 17:27:14.151396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.151356 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" event={"ID":"687643ca-4ad3-44fa-8f6b-ef388beace89","Type":"ContainerStarted","Data":"975e25b509920a10259e6f262cd24dad26c9442085c3de924e43a758210fa5f3"}
Apr 17 17:27:14.154494 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.154428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" event={"ID":"5d8fa452-8b8f-48f3-90bf-a980adde658a","Type":"ContainerStarted","Data":"858b8d457652b14d91a37b9e61be90f4a3aaab3326e5d1e7bf06fbccb6e6147d"}
Apr 17 17:27:14.167300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.167254 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" podStartSLOduration=2.336131564 podStartE2EDuration="4.167241827s" podCreationTimestamp="2026-04-17 17:27:10 +0000 UTC" firstStartedPulling="2026-04-17 17:27:11.442262771 +0000 UTC m=+124.381168013" lastFinishedPulling="2026-04-17 17:27:13.273373024 +0000 UTC m=+126.212278276" observedRunningTime="2026-04-17 17:27:14.166740964 +0000 UTC m=+127.105646228" watchObservedRunningTime="2026-04-17 17:27:14.167241827 +0000 UTC m=+127.106147281"
Apr 17 17:27:14.307395 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.307356 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:14.307395 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.307394 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:14.307793 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:14.307740 2571 scope.go:117] "RemoveContainer" containerID="e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37"
Apr 17 17:27:14.307926 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:14.307908 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nq8bf_openshift-console-operator(9dd87d69-6559-458b-94cc-fc8a74f27e9a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" podUID="9dd87d69-6559-458b-94cc-fc8a74f27e9a"
Apr 17 17:27:15.157902 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:15.157814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" event={"ID":"5d8fa452-8b8f-48f3-90bf-a980adde658a","Type":"ContainerStarted","Data":"32527efed8a69f8d3fe5e0669e9a3edf2fd74b758f50bd0f6cc9d07eed613a89"}
Apr 17 17:27:15.157902 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:15.157860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" event={"ID":"5d8fa452-8b8f-48f3-90bf-a980adde658a","Type":"ContainerStarted","Data":"91af3364976c245a7f4d4938effdcd1d12a6e3044329f5605cd816ed37ab2f29"}
Apr 17 17:27:15.175858 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:15.175804 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4dpv" podStartSLOduration=1.040325957 podStartE2EDuration="2.175789356s" podCreationTimestamp="2026-04-17 17:27:13 +0000 UTC" firstStartedPulling="2026-04-17 17:27:13.748030355 +0000 UTC m=+126.686935601" lastFinishedPulling="2026-04-17 17:27:14.883493745 +0000 UTC m=+127.822399000" observedRunningTime="2026-04-17 17:27:15.174430416 +0000 UTC m=+128.113335681" watchObservedRunningTime="2026-04-17 17:27:15.175789356 +0000 UTC m=+128.114694619"
Apr 17 17:27:16.460636 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.460598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:27:16.461060 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:16.460748 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:27:16.461060 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:16.460815 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs podName:a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:18.460799735 +0000 UTC m=+251.399704981 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs") pod "network-metrics-daemon-zbptt" (UID: "a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4") : secret "metrics-daemon-secret" not found
Apr 17 17:27:16.951860 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.951826 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2lwxn"]
Apr 17 17:27:16.955175 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.955156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:16.959173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.959153 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-njqwh\""
Apr 17 17:27:16.959300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.959277 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 17:27:16.959879 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.959849 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 17:27:16.959879 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.959860 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 17:27:16.960027 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.960009 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 17:27:16.965697 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:16.965677 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2lwxn"]
Apr 17 17:27:17.064388 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.064354 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-cabundle\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.064388 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.064392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp9b6\" (UniqueName: \"kubernetes.io/projected/201a7c5f-c5af-465c-8046-20ccdfd64d24-kube-api-access-fp9b6\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.064593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.064417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-key\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.165569 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.165535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-cabundle\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.165569 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.165574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp9b6\" (UniqueName: \"kubernetes.io/projected/201a7c5f-c5af-465c-8046-20ccdfd64d24-kube-api-access-fp9b6\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.165802 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.165605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-key\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.166291 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.166222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-cabundle\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.168092 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.168070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/201a7c5f-c5af-465c-8046-20ccdfd64d24-signing-key\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.179674 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.179647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp9b6\" (UniqueName: \"kubernetes.io/projected/201a7c5f-c5af-465c-8046-20ccdfd64d24-kube-api-access-fp9b6\") pod \"service-ca-865cb79987-2lwxn\" (UID: \"201a7c5f-c5af-465c-8046-20ccdfd64d24\") " pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.264289 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.264260 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2lwxn"
Apr 17 17:27:17.382371 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:17.382336 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2lwxn"]
Apr 17 17:27:17.386229 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:17.386201 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201a7c5f_c5af_465c_8046_20ccdfd64d24.slice/crio-4eae1d818ec8056e80ea041919066812f38d6fa6049c0bbfb83a7b3b274dc90f WatchSource:0}: Error finding container 4eae1d818ec8056e80ea041919066812f38d6fa6049c0bbfb83a7b3b274dc90f: Status 404 returned error can't find the container with id 4eae1d818ec8056e80ea041919066812f38d6fa6049c0bbfb83a7b3b274dc90f
Apr 17 17:27:18.166482 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:18.166443 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2lwxn" event={"ID":"201a7c5f-c5af-465c-8046-20ccdfd64d24","Type":"ContainerStarted","Data":"560a1706bf48f4005caf40e6be5bc9411290a7235f3a37930b9389fc693cc1be"}
Apr 17 17:27:18.166482 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:18.166486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2lwxn" event={"ID":"201a7c5f-c5af-465c-8046-20ccdfd64d24","Type":"ContainerStarted","Data":"4eae1d818ec8056e80ea041919066812f38d6fa6049c0bbfb83a7b3b274dc90f"}
Apr 17 17:27:18.187734 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:18.187684 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2lwxn" podStartSLOduration=2.187669453 podStartE2EDuration="2.187669453s" podCreationTimestamp="2026-04-17 17:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:18.186393914 +0000 UTC m=+131.125299193" watchObservedRunningTime="2026-04-17 17:27:18.187669453 +0000 UTC m=+131.126574716"
Apr 17 17:27:18.779783 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:18.779739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:18.779975 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:18.779886 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:27:18.779975 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:18.779952 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls podName:ec479a05-737c-41d2-aa0d-9d627b239283 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:34.779936807 +0000 UTC m=+147.718842048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t9ptd" (UID: "ec479a05-737c-41d2-aa0d-9d627b239283") : secret "samples-operator-tls" not found
Apr 17 17:27:27.654352 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:27.654309 2571 scope.go:117] "RemoveContainer" containerID="e11cfa5a1aefa59179ab2b78b6b549b1f085d497b2998c7df3f6f6f560d86a37"
Apr 17 17:27:28.190737 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:28.190708 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:27:28.190903 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:28.190800 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" event={"ID":"9dd87d69-6559-458b-94cc-fc8a74f27e9a","Type":"ContainerStarted","Data":"b06b2de6c96703a24d39b0f93fcab4f7189a823db25ca0a7e98e671035f2becc"}
Apr 17 17:27:28.191132 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:28.191109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:28.210346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:28.210296 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf" podStartSLOduration=23.364991452 podStartE2EDuration="25.210281733s" podCreationTimestamp="2026-04-17 17:27:03 +0000 UTC" firstStartedPulling="2026-04-17 17:27:04.424604049 +0000 UTC m=+117.363509289" lastFinishedPulling="2026-04-17 17:27:06.269894328 +0000 UTC m=+119.208799570" observedRunningTime="2026-04-17 17:27:28.208789238 +0000 UTC m=+141.147694503" watchObservedRunningTime="2026-04-17 17:27:28.210281733 +0000 UTC m=+141.149186995"
Apr 17 17:27:28.939137 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:28.939108 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-nq8bf"
Apr 17 17:27:34.807142 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:34.807089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:34.809676 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:34.809654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec479a05-737c-41d2-aa0d-9d627b239283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t9ptd\" (UID: \"ec479a05-737c-41d2-aa0d-9d627b239283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:35.108381 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:35.108297 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gprl4\""
Apr 17 17:27:35.116378 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:35.116355 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"
Apr 17 17:27:35.235023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:35.234967 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd"]
Apr 17 17:27:36.212805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.212767 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" event={"ID":"ec479a05-737c-41d2-aa0d-9d627b239283","Type":"ContainerStarted","Data":"7c6c9890483062b8d6ea10fc3dd93cea804e78d6bcd075476a54893e89a1b4c8"}
Apr 17 17:27:36.694237 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.694203 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-244jf"]
Apr 17 17:27:36.700878 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.700854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.703480 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.703452 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:27:36.704588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.704401 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:27:36.704588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.704429 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:27:36.704588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.704439 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:27:36.704588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.704476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-68nj2\""
Apr 17 17:27:36.709369 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.709201 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-244jf"]
Apr 17 17:27:36.721657 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.721625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/457a7752-1528-46e3-a693-0e6eddb138c7-data-volume\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.721809 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.721715 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/457a7752-1528-46e3-a693-0e6eddb138c7-crio-socket\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.721809 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.721787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snf5\" (UniqueName: \"kubernetes.io/projected/457a7752-1528-46e3-a693-0e6eddb138c7-kube-api-access-4snf5\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.721907 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.721818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/457a7752-1528-46e3-a693-0e6eddb138c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.721907 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.721849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/457a7752-1528-46e3-a693-0e6eddb138c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.802039 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.801974 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-v6scj"]
Apr 17 17:27:36.809784 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.809751 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"]
Apr 17 17:27:36.810550 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.810525 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:36.813461 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.813340 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lmkz6\""
Apr 17 17:27:36.813655 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.813641 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 17:27:36.813877 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.813851 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 17:27:36.814305 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.814272 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"]
Apr 17 17:27:36.814513 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.814485 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"
Apr 17 17:27:36.817890 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.817866 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"]
Apr 17 17:27:36.817890 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.817893 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-v6scj"]
Apr 17 17:27:36.818060 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.817904 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"]
Apr 17 17:27:36.818060 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.818021 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:36.819232 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.819158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-294mx\""
Apr 17 17:27:36.819419 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.819397 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 17:27:36.820776 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.820757 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 17:27:36.821620 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.821411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 17:27:36.821620 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.821428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bxck6\""
Apr 17 17:27:36.822319 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/457a7752-1528-46e3-a693-0e6eddb138c7-data-volume\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822428 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/457a7752-1528-46e3-a693-0e6eddb138c7-crio-socket\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4snf5\" (UniqueName: \"kubernetes.io/projected/457a7752-1528-46e3-a693-0e6eddb138c7-kube-api-access-4snf5\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/457a7752-1528-46e3-a693-0e6eddb138c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/457a7752-1528-46e3-a693-0e6eddb138c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822695 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/457a7752-1528-46e3-a693-0e6eddb138c7-crio-socket\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.822956 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.822936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/457a7752-1528-46e3-a693-0e6eddb138c7-data-volume\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.823572 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.823553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/457a7752-1528-46e3-a693-0e6eddb138c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.825406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.825299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/457a7752-1528-46e3-a693-0e6eddb138c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.842679 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.842655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4snf5\" (UniqueName: \"kubernetes.io/projected/457a7752-1528-46e3-a693-0e6eddb138c7-kube-api-access-4snf5\") pod \"insights-runtime-extractor-244jf\" (UID: \"457a7752-1528-46e3-a693-0e6eddb138c7\") " pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:36.923280 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.923240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/050d613e-46b7-4949-ab46-ebcf42deeeb3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-clqzj\" (UID: \"050d613e-46b7-4949-ab46-ebcf42deeeb3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"
Apr 17 17:27:36.923460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.923288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:36.923460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.923365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzq22\" (UniqueName: \"kubernetes.io/projected/c71e7458-3162-4d35-af2f-4cb2853de04b-kube-api-access-zzq22\") pod \"downloads-6bcc868b7-v6scj\" (UID: \"c71e7458-3162-4d35-af2f-4cb2853de04b\") " pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:36.923460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:36.923409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.013383 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.013356 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-244jf"
Apr 17 17:27:37.024102 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.024067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/050d613e-46b7-4949-ab46-ebcf42deeeb3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-clqzj\" (UID: \"050d613e-46b7-4949-ab46-ebcf42deeeb3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"
Apr 17 17:27:37.024275 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.024115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.024275 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.024151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzq22\" (UniqueName: \"kubernetes.io/projected/c71e7458-3162-4d35-af2f-4cb2853de04b-kube-api-access-zzq22\") pod \"downloads-6bcc868b7-v6scj\" (UID: \"c71e7458-3162-4d35-af2f-4cb2853de04b\") " pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:37.024275 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.024189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.024974 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.024944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.026739 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.026714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/050d613e-46b7-4949-ab46-ebcf42deeeb3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-clqzj\" (UID: \"050d613e-46b7-4949-ab46-ebcf42deeeb3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"
Apr 17 17:27:37.026840 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.026822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-92kx6\" (UID: \"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.037438 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.037406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzq22\" (UniqueName: \"kubernetes.io/projected/c71e7458-3162-4d35-af2f-4cb2853de04b-kube-api-access-zzq22\") pod \"downloads-6bcc868b7-v6scj\" (UID: \"c71e7458-3162-4d35-af2f-4cb2853de04b\") " pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:37.124819 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.124339 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:37.139643 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.139115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"
Apr 17 17:27:37.151046 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.150862 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"
Apr 17 17:27:37.185155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.185104 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-244jf"]
Apr 17 17:27:37.188905 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:37.188872 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457a7752_1528_46e3_a693_0e6eddb138c7.slice/crio-016b851d791e24f57d85686c1965d17b7b1a278a935ebfbf742edaab3f6e7eaa WatchSource:0}: Error finding container 016b851d791e24f57d85686c1965d17b7b1a278a935ebfbf742edaab3f6e7eaa: Status 404 returned error can't find the container with id 016b851d791e24f57d85686c1965d17b7b1a278a935ebfbf742edaab3f6e7eaa
Apr 17 17:27:37.238316 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.236326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" event={"ID":"ec479a05-737c-41d2-aa0d-9d627b239283","Type":"ContainerStarted","Data":"b193c6114ba870046ad17203535ca48ca21f7b74cabd40b154a743f383dd9b6a"}
Apr 17 17:27:37.238316 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.238277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" event={"ID":"ec479a05-737c-41d2-aa0d-9d627b239283","Type":"ContainerStarted","Data":"a25e09d90fcfc20e04ff8a23060c9a5ef49089bb6247edf54fabba7221541492"}
Apr 17 17:27:37.242811 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.242753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-244jf" event={"ID":"457a7752-1528-46e3-a693-0e6eddb138c7","Type":"ContainerStarted","Data":"016b851d791e24f57d85686c1965d17b7b1a278a935ebfbf742edaab3f6e7eaa"}
Apr 17 17:27:37.257688 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.257599 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t9ptd" podStartSLOduration=33.505883291 podStartE2EDuration="35.25758037s" podCreationTimestamp="2026-04-17 17:27:02 +0000 UTC" firstStartedPulling="2026-04-17 17:27:35.277737272 +0000 UTC m=+148.216642513" lastFinishedPulling="2026-04-17 17:27:37.029434332 +0000 UTC m=+149.968339592" observedRunningTime="2026-04-17 17:27:37.256093587 +0000 UTC m=+150.194998866" watchObservedRunningTime="2026-04-17 17:27:37.25758037 +0000 UTC m=+150.196485633"
Apr 17 17:27:37.300194 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.300151 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-v6scj"]
Apr 17 17:27:37.307504 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:37.307474 2571 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc71e7458_3162_4d35_af2f_4cb2853de04b.slice/crio-2099f9f1d3b7505d5dc90ddd97dbab5158d64cba09ffd9907bb33eef4eba2813 WatchSource:0}: Error finding container 2099f9f1d3b7505d5dc90ddd97dbab5158d64cba09ffd9907bb33eef4eba2813: Status 404 returned error can't find the container with id 2099f9f1d3b7505d5dc90ddd97dbab5158d64cba09ffd9907bb33eef4eba2813 Apr 17 17:27:37.328679 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.328336 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj"] Apr 17 17:27:37.331316 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:37.331287 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod050d613e_46b7_4949_ab46_ebcf42deeeb3.slice/crio-bb0f1cdcf5a22aa97f4c19bd4d28f7d273d16285c21ab17fd90754290dd13b3c WatchSource:0}: Error finding container bb0f1cdcf5a22aa97f4c19bd4d28f7d273d16285c21ab17fd90754290dd13b3c: Status 404 returned error can't find the container with id bb0f1cdcf5a22aa97f4c19bd4d28f7d273d16285c21ab17fd90754290dd13b3c Apr 17 17:27:37.341396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:37.341370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-92kx6"] Apr 17 17:27:37.344340 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:37.344299 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a266ff4_f2ce_4fe5_b4cc_806ebc260ff6.slice/crio-0dc0eb33515550ee0195ec35ed6acb30c005aa1175d5b35db1ca332d61edda9a WatchSource:0}: Error finding container 0dc0eb33515550ee0195ec35ed6acb30c005aa1175d5b35db1ca332d61edda9a: Status 404 returned error can't find the container with id 0dc0eb33515550ee0195ec35ed6acb30c005aa1175d5b35db1ca332d61edda9a Apr 17 17:27:38.246923 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:27:38.246808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-v6scj" event={"ID":"c71e7458-3162-4d35-af2f-4cb2853de04b","Type":"ContainerStarted","Data":"2099f9f1d3b7505d5dc90ddd97dbab5158d64cba09ffd9907bb33eef4eba2813"} Apr 17 17:27:38.248291 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.248237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6" event={"ID":"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6","Type":"ContainerStarted","Data":"0dc0eb33515550ee0195ec35ed6acb30c005aa1175d5b35db1ca332d61edda9a"} Apr 17 17:27:38.249547 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.249505 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj" event={"ID":"050d613e-46b7-4949-ab46-ebcf42deeeb3","Type":"ContainerStarted","Data":"bb0f1cdcf5a22aa97f4c19bd4d28f7d273d16285c21ab17fd90754290dd13b3c"} Apr 17 17:27:38.251600 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.251570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-244jf" event={"ID":"457a7752-1528-46e3-a693-0e6eddb138c7","Type":"ContainerStarted","Data":"0cddcc4ec0e58c1d8995208086894f7ecc362232392fc23068d9d9d7ce4f4634"} Apr 17 17:27:38.251707 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.251611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-244jf" event={"ID":"457a7752-1528-46e3-a693-0e6eddb138c7","Type":"ContainerStarted","Data":"2205a289f33fa01a2592eb74e77132242392d728e77661c09655186579e2d4f8"} Apr 17 17:27:38.843486 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.843450 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"] Apr 17 17:27:38.848309 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.848283 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.852263 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852242 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:27:38.852380 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852351 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:27:38.852450 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852405 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wgtt5\"" Apr 17 17:27:38.852450 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852242 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:27:38.852606 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852578 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:27:38.852691 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.852660 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:27:38.854625 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.854545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"] Apr 17 17:27:38.944844 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.944808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.945104 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.944944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.945104 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.944969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.945104 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.945056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rjw\" (UniqueName: \"kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.945104 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.945095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:38.945332 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:38.945134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.045904 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.045778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.045909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.045938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.045972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rjw\" (UniqueName: \"kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.046038 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.046073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.046643 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.046582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.047015 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.046928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.047163 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.047122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.049290 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:27:39.049261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.049480 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.049456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.056241 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.056199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rjw\" (UniqueName: \"kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw\") pod \"console-66d8c9c64d-k8knx\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.179239 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.179165 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:27:39.257694 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.257642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6" event={"ID":"6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6","Type":"ContainerStarted","Data":"70b9d4e59eecc7d0bd808c574470e6e8474ae415b60d6642285a6344ec0055ef"} Apr 17 17:27:39.260628 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.260590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj" event={"ID":"050d613e-46b7-4949-ab46-ebcf42deeeb3","Type":"ContainerStarted","Data":"87bdcb263f45f4a89b7be70f179124387dd10fa0c25a78379712a386ce00d2c0"} Apr 17 17:27:39.261193 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.261171 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj" Apr 17 17:27:39.267803 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.267771 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj" Apr 17 17:27:39.298117 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.297958 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-92kx6" podStartSLOduration=1.847883195 podStartE2EDuration="3.297939735s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.34615751 +0000 UTC m=+150.285062754" lastFinishedPulling="2026-04-17 17:27:38.79621405 +0000 UTC m=+151.735119294" observedRunningTime="2026-04-17 17:27:39.276395284 +0000 UTC m=+152.215300549" watchObservedRunningTime="2026-04-17 17:27:39.297939735 +0000 UTC m=+152.236844999" Apr 17 17:27:39.335283 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:27:39.335224 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-clqzj" podStartSLOduration=1.86997489 podStartE2EDuration="3.335201963s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.333604877 +0000 UTC m=+150.272510117" lastFinishedPulling="2026-04-17 17:27:38.798831936 +0000 UTC m=+151.737737190" observedRunningTime="2026-04-17 17:27:39.298197061 +0000 UTC m=+152.237102314" watchObservedRunningTime="2026-04-17 17:27:39.335201963 +0000 UTC m=+152.274107227" Apr 17 17:27:39.335533 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.335511 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"] Apr 17 17:27:39.338742 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:39.338703 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9c78c6_8531_4bfe_a000_a26721924369.slice/crio-9ddd8fb7b5007670c9e93a6133f2ddbbdc5a591129dd3f8629867746bb8b9e9e WatchSource:0}: Error finding container 9ddd8fb7b5007670c9e93a6133f2ddbbdc5a591129dd3f8629867746bb8b9e9e: Status 404 returned error can't find the container with id 9ddd8fb7b5007670c9e93a6133f2ddbbdc5a591129dd3f8629867746bb8b9e9e Apr 17 17:27:39.691169 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.691075 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k9kxz"] Apr 17 17:27:39.695927 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.695718 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.699205 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.698469 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:27:39.699205 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.698774 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:27:39.699205 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.698983 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 17:27:39.699728 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.699587 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-29fkf\"" Apr 17 17:27:39.699728 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.699605 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 17:27:39.699728 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.699612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:27:39.701515 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.701484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k9kxz"] Apr 17 17:27:39.754225 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.754184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcjqv\" (UniqueName: \"kubernetes.io/projected/841bb604-188c-482a-815c-d8da332a5b02-kube-api-access-pcjqv\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: 
\"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.754460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.754284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.754460 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.754370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/841bb604-188c-482a-815c-d8da332a5b02-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.754582 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.754489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.855668 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.855635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 
17 17:27:39.855854 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.855695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcjqv\" (UniqueName: \"kubernetes.io/projected/841bb604-188c-482a-815c-d8da332a5b02-kube-api-access-pcjqv\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.855854 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.855739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.855854 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.855775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/841bb604-188c-482a-815c-d8da332a5b02-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.856356 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:39.856109 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 17:27:39.856356 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:39.856185 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls podName:841bb604-188c-482a-815c-d8da332a5b02 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:40.356165448 +0000 UTC m=+153.295070689 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-k9kxz" (UID: "841bb604-188c-482a-815c-d8da332a5b02") : secret "prometheus-operator-tls" not found Apr 17 17:27:39.856559 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.856534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/841bb604-188c-482a-815c-d8da332a5b02-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.859189 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.859163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:39.865661 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:39.865634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcjqv\" (UniqueName: \"kubernetes.io/projected/841bb604-188c-482a-815c-d8da332a5b02-kube-api-access-pcjqv\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:40.270726 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.270683 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-244jf" 
event={"ID":"457a7752-1528-46e3-a693-0e6eddb138c7","Type":"ContainerStarted","Data":"6eec9114e97224471760fd3709bd10a88ef5aacac6560fde83054dcbb00e84d2"} Apr 17 17:27:40.272056 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.271964 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d8c9c64d-k8knx" event={"ID":"5a9c78c6-8531-4bfe-a000-a26721924369","Type":"ContainerStarted","Data":"9ddd8fb7b5007670c9e93a6133f2ddbbdc5a591129dd3f8629867746bb8b9e9e"} Apr 17 17:27:40.360860 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.360458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:40.363911 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.363859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/841bb604-188c-482a-815c-d8da332a5b02-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k9kxz\" (UID: \"841bb604-188c-482a-815c-d8da332a5b02\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:40.611231 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.611135 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" Apr 17 17:27:40.765721 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.765658 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-244jf" podStartSLOduration=2.073220677 podStartE2EDuration="4.765636467s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.286391191 +0000 UTC m=+150.225296448" lastFinishedPulling="2026-04-17 17:27:39.978806991 +0000 UTC m=+152.917712238" observedRunningTime="2026-04-17 17:27:40.291069911 +0000 UTC m=+153.229975175" watchObservedRunningTime="2026-04-17 17:27:40.765636467 +0000 UTC m=+153.704541733" Apr 17 17:27:40.766605 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:40.766571 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k9kxz"] Apr 17 17:27:40.770509 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:40.770478 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841bb604_188c_482a_815c_d8da332a5b02.slice/crio-f929540477452bdc4ab1c05a1c98997f9dc68d81050580b4ccf34273434e9a19 WatchSource:0}: Error finding container f929540477452bdc4ab1c05a1c98997f9dc68d81050580b4ccf34273434e9a19: Status 404 returned error can't find the container with id f929540477452bdc4ab1c05a1c98997f9dc68d81050580b4ccf34273434e9a19 Apr 17 17:27:41.276706 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:41.276666 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" event={"ID":"841bb604-188c-482a-815c-d8da332a5b02","Type":"ContainerStarted","Data":"f929540477452bdc4ab1c05a1c98997f9dc68d81050580b4ccf34273434e9a19"} Apr 17 17:27:41.994932 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:41.994885 2571 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-frpbr" podUID="4efcb7d2-ede6-4e86-920c-19ff33a94f7b" Apr 17 17:27:42.010147 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:42.010083 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r7dzq" podUID="32351f44-6fa2-44e0-81dd-549f2f19d705" Apr 17 17:27:42.282317 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:42.282284 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-frpbr" Apr 17 17:27:43.288736 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:43.288653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d8c9c64d-k8knx" event={"ID":"5a9c78c6-8531-4bfe-a000-a26721924369","Type":"ContainerStarted","Data":"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6"} Apr 17 17:27:43.291972 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:43.291936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" event={"ID":"841bb604-188c-482a-815c-d8da332a5b02","Type":"ContainerStarted","Data":"b96f5037dae36cdc920467e774543ce90ca99dc98b6890cf87da60ec948df7c7"} Apr 17 17:27:43.291972 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:43.291972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" event={"ID":"841bb604-188c-482a-815c-d8da332a5b02","Type":"ContainerStarted","Data":"7d9395470db9163eeac127e6dc5702f278935ecc99572c4353e6ea84df03c202"} Apr 17 17:27:43.308340 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:43.308203 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66d8c9c64d-k8knx" 
podStartSLOduration=1.603653859 podStartE2EDuration="5.308175664s" podCreationTimestamp="2026-04-17 17:27:38 +0000 UTC" firstStartedPulling="2026-04-17 17:27:39.341174447 +0000 UTC m=+152.280079693" lastFinishedPulling="2026-04-17 17:27:43.045696241 +0000 UTC m=+155.984601498" observedRunningTime="2026-04-17 17:27:43.305533152 +0000 UTC m=+156.244438417" watchObservedRunningTime="2026-04-17 17:27:43.308175664 +0000 UTC m=+156.247080928" Apr 17 17:27:43.666209 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:43.666126 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zbptt" podUID="a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4" Apr 17 17:27:45.067724 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.067551 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-k9kxz" podStartSLOduration=3.795783547 podStartE2EDuration="6.067530403s" podCreationTimestamp="2026-04-17 17:27:39 +0000 UTC" firstStartedPulling="2026-04-17 17:27:40.773048677 +0000 UTC m=+153.711953923" lastFinishedPulling="2026-04-17 17:27:43.04479552 +0000 UTC m=+155.983700779" observedRunningTime="2026-04-17 17:27:43.322109353 +0000 UTC m=+156.261014617" watchObservedRunningTime="2026-04-17 17:27:45.067530403 +0000 UTC m=+158.006435667" Apr 17 17:27:45.067724 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.068059 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f8mfn"] Apr 17 17:27:45.072349 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.072329 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.075819 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.075798 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:27:45.076718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.076041 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4qm8w\"" Apr 17 17:27:45.076718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.076271 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:27:45.076718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.076408 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:27:45.207720 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.207894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-sys\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.207894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.207894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.208095 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-textfile\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.208095 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-wtmp\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.208095 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.207967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-root\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 
17:27:45.208095 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.208012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-metrics-client-ca\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.208095 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.208071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8t9s\" (UniqueName: \"kubernetes.io/projected/35d91680-e317-4b76-b153-a4f79eba959a-kube-api-access-c8t9s\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309289 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-root\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309302 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-metrics-client-ca\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8t9s\" (UniqueName: \"kubernetes.io/projected/35d91680-e317-4b76-b153-a4f79eba959a-kube-api-access-c8t9s\") pod \"node-exporter-f8mfn\" (UID: 
\"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-root\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-sys\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-textfile\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-wtmp\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309927 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309763 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-wtmp\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.309927 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:45.309795 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:27:45.309927 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:45.309854 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls podName:35d91680-e317-4b76-b153-a4f79eba959a nodeName:}" failed. 
No retries permitted until 2026-04-17 17:27:45.809836486 +0000 UTC m=+158.748741726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls") pod "node-exporter-f8mfn" (UID: "35d91680-e317-4b76-b153-a4f79eba959a") : secret "node-exporter-tls" not found Apr 17 17:27:45.310123 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.309950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-metrics-client-ca\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.310348 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.310248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-textfile\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.310348 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.310310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.310348 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.310314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d91680-e317-4b76-b153-a4f79eba959a-sys\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " 
pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.312718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.312697 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.318469 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.318419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8t9s\" (UniqueName: \"kubernetes.io/projected/35d91680-e317-4b76-b153-a4f79eba959a-kube-api-access-c8t9s\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.813822 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:45.813775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:45.814019 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:45.813919 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:27:45.814019 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:27:45.814018 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls podName:35d91680-e317-4b76-b153-a4f79eba959a nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.813982359 +0000 UTC m=+159.752887614 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls") pod "node-exporter-f8mfn" (UID: "35d91680-e317-4b76-b153-a4f79eba959a") : secret "node-exporter-tls" not found Apr 17 17:27:46.821258 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.821217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:27:46.821683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.821305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:46.824103 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.824072 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d91680-e317-4b76-b153-a4f79eba959a-node-exporter-tls\") pod \"node-exporter-f8mfn\" (UID: \"35d91680-e317-4b76-b153-a4f79eba959a\") " pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:46.824233 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.824099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4efcb7d2-ede6-4e86-920c-19ff33a94f7b-metrics-tls\") pod \"dns-default-frpbr\" (UID: \"4efcb7d2-ede6-4e86-920c-19ff33a94f7b\") " pod="openshift-dns/dns-default-frpbr" Apr 17 17:27:46.885908 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.885874 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f8mfn" Apr 17 17:27:46.895469 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:46.895429 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d91680_e317_4b76_b153_a4f79eba959a.slice/crio-bb72a9312baefd661869910357373c227a2766a84c62c06da29ae853daa9ca25 WatchSource:0}: Error finding container bb72a9312baefd661869910357373c227a2766a84c62c06da29ae853daa9ca25: Status 404 returned error can't find the container with id bb72a9312baefd661869910357373c227a2766a84c62c06da29ae853daa9ca25 Apr 17 17:27:46.922396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.922364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:27:46.925196 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:46.925171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32351f44-6fa2-44e0-81dd-549f2f19d705-cert\") pod \"ingress-canary-r7dzq\" (UID: \"32351f44-6fa2-44e0-81dd-549f2f19d705\") " pod="openshift-ingress-canary/ingress-canary-r7dzq" Apr 17 17:27:47.086271 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:47.086193 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\"" Apr 17 17:27:47.093587 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:47.093558 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-frpbr" Apr 17 17:27:47.305319 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:47.305266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8mfn" event={"ID":"35d91680-e317-4b76-b153-a4f79eba959a","Type":"ContainerStarted","Data":"bb72a9312baefd661869910357373c227a2766a84c62c06da29ae853daa9ca25"} Apr 17 17:27:48.053155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.053109 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"] Apr 17 17:27:48.058186 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.058161 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.061481 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.061227 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 17:27:48.061481 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.061277 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 17:27:48.061481 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.061232 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 17:27:48.061782 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.061765 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2t9bjqp8vg57a\"" Apr 17 17:27:48.061855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.061832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 17:27:48.061908 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:27:48.061849 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-869lr\"" Apr 17 17:27:48.062322 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.062121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 17:27:48.068872 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.068847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"] Apr 17 17:27:48.132831 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjc62\" (UniqueName: \"kubernetes.io/projected/a205f472-b21e-4f79-ae09-96715d56ac66-kube-api-access-pjc62\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.132831 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.132831 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 
17:27:48.132831 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-grpc-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.132831 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.133228 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.133228 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.132887 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.133228 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:27:48.132929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a205f472-b21e-4f79-ae09-96715d56ac66-metrics-client-ca\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234087 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234278 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234278 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-grpc-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234278 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234278 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234278 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234516 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a205f472-b21e-4f79-ae09-96715d56ac66-metrics-client-ca\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:27:48.234516 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.234333 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjc62\" (UniqueName: \"kubernetes.io/projected/a205f472-b21e-4f79-ae09-96715d56ac66-kube-api-access-pjc62\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: 
\"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.236157 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.236032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a205f472-b21e-4f79-ae09-96715d56ac66-metrics-client-ca\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.237981 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.237935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.242616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.239055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.242616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.239431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.242616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.239499 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.242616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.239491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-grpc-tls\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.242616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.241849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjc62\" (UniqueName: \"kubernetes.io/projected/a205f472-b21e-4f79-ae09-96715d56ac66-kube-api-access-pjc62\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.246608 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.246579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a205f472-b21e-4f79-ae09-96715d56ac66-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55c6bf5658-7jfq4\" (UID: \"a205f472-b21e-4f79-ae09-96715d56ac66\") " pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:48.371962 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:48.371870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"
Apr 17 17:27:49.179797 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:49.179742 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66d8c9c64d-k8knx"
Apr 17 17:27:49.179797 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:49.179788 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66d8c9c64d-k8knx"
Apr 17 17:27:49.181238 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:49.181206 2571 patch_prober.go:28] interesting pod/console-66d8c9c64d-k8knx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.19:8443/health\": dial tcp 10.133.0.19:8443: connect: connection refused" start-of-body=
Apr 17 17:27:49.181357 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:49.181267 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-66d8c9c64d-k8knx" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" containerName="console" probeResult="failure" output="Get \"https://10.133.0.19:8443/health\": dial tcp 10.133.0.19:8443: connect: connection refused"
Apr 17 17:27:51.288180 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.288145 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:27:51.300603 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.300567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.304093 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.303217 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 17:27:51.304093 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.303471 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 17:27:51.304093 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.303849 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.304391 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.304896 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305022 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305060 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-33rvvt8hga5fo\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305166 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305278 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305323 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jn6zl\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305340 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 17:27:51.305593 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.305457 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 17:27:51.308030 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.306444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 17:27:51.309496 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.309476 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 17:27:51.311446 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.311424 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 17:27:51.462874 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.462838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.462874 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.462872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.462893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.462916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463145 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463384 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463384 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463384 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnvw\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463384 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463329 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463384 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.463735 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.463605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564511 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.564775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564878 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564903 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnvw\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.564978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.565022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565437 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.565291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.565601 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.565574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.566967 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.566456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.568848 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.568827 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.569059 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.569037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.569962 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.569938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.571230 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.570819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.571230 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.570894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.572738 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.571367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.572738 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.571766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.573400 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.573357 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.573501 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.573401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.573798 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.573734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.573926 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.573906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.574385 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.574360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.574896 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.574878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.575691 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.575672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.581257 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.581217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnvw\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw\") pod \"prometheus-k8s-0\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:51.614952 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:51.614909 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:54.178067 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.178036 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:27:54.185346 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:54.185272 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014735e0_f64c_4e42_add4_183e239411ae.slice/crio-a5a41c5fc26a02fc034e52dac25ba2d762b70ae1c435b24c9789c0988fc3537d WatchSource:0}: Error finding container a5a41c5fc26a02fc034e52dac25ba2d762b70ae1c435b24c9789c0988fc3537d: Status 404 returned error can't find the container with id a5a41c5fc26a02fc034e52dac25ba2d762b70ae1c435b24c9789c0988fc3537d
Apr 17 17:27:54.326498 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.326412 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-v6scj" event={"ID":"c71e7458-3162-4d35-af2f-4cb2853de04b","Type":"ContainerStarted","Data":"3465d21fa946fc9d9edae78d927428ecb4d3e2411390b618542663a45df4a22d"}
Apr 17 17:27:54.326793 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.326773 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:54.327868 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.327844 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"a5a41c5fc26a02fc034e52dac25ba2d762b70ae1c435b24c9789c0988fc3537d"}
Apr 17 17:27:54.328217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.328197 2571 patch_prober.go:28] interesting pod/downloads-6bcc868b7-v6scj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.16:8080/\": dial tcp 10.133.0.16:8080: connect: connection refused" start-of-body=
Apr 17 17:27:54.328300 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.328245 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-v6scj" podUID="c71e7458-3162-4d35-af2f-4cb2853de04b" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.16:8080/\": dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 17 17:27:54.346505 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.346243 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-v6scj" podStartSLOduration=1.534683696 podStartE2EDuration="18.346222649s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.309742914 +0000 UTC m=+150.248648159" lastFinishedPulling="2026-04-17 17:27:54.12128187 +0000 UTC m=+167.060187112" observedRunningTime="2026-04-17 17:27:54.344922536 +0000 UTC m=+167.283827800" watchObservedRunningTime="2026-04-17 17:27:54.346222649 +0000 UTC m=+167.285127926"
Apr 17 17:27:54.398184 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.397940 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-frpbr"]
Apr 17 17:27:54.400823 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:54.400796 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55c6bf5658-7jfq4"]
Apr 17 17:27:54.503347 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:54.503310 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4efcb7d2_ede6_4e86_920c_19ff33a94f7b.slice/crio-ef420ac1e9a61ccaf249239ea8ccf6de4fbd2661a76286280302d2771860f964 WatchSource:0}: Error finding container ef420ac1e9a61ccaf249239ea8ccf6de4fbd2661a76286280302d2771860f964: Status 404 returned error can't find the container with id ef420ac1e9a61ccaf249239ea8ccf6de4fbd2661a76286280302d2771860f964
Apr 17 17:27:54.504245 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:27:54.504119 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda205f472_b21e_4f79_ae09_96715d56ac66.slice/crio-147d3b65e6e4b431b86288c9aa37190216ca6626c1daa9342c93c8a452ffe309 WatchSource:0}: Error finding container 147d3b65e6e4b431b86288c9aa37190216ca6626c1daa9342c93c8a452ffe309: Status 404 returned error can't find the container with id 147d3b65e6e4b431b86288c9aa37190216ca6626c1daa9342c93c8a452ffe309
Apr 17 17:27:55.333043 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.332973 2571 generic.go:358] "Generic (PLEG): container finished" podID="35d91680-e317-4b76-b153-a4f79eba959a" containerID="45dd5fc6688e2798c96a66121fb02e15d6199dda2a8f70dbfd9bf5ddc543be99" exitCode=0
Apr 17 17:27:55.333589 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.333127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8mfn" event={"ID":"35d91680-e317-4b76-b153-a4f79eba959a","Type":"ContainerDied","Data":"45dd5fc6688e2798c96a66121fb02e15d6199dda2a8f70dbfd9bf5ddc543be99"}
Apr 17 17:27:55.337170 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.336950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frpbr" event={"ID":"4efcb7d2-ede6-4e86-920c-19ff33a94f7b","Type":"ContainerStarted","Data":"ef420ac1e9a61ccaf249239ea8ccf6de4fbd2661a76286280302d2771860f964"}
Apr 17 17:27:55.340124 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.340052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"147d3b65e6e4b431b86288c9aa37190216ca6626c1daa9342c93c8a452ffe309"}
Apr 17 17:27:55.355025 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.354947 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-v6scj"
Apr 17 17:27:55.654396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.654321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:27:55.657580 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.657547 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\""
Apr 17 17:27:55.667784 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.667684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r7dzq"
Apr 17 17:27:55.839616 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:55.839570 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r7dzq"]
Apr 17 17:27:56.042664 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.042621 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"]
Apr 17 17:27:56.346689 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.346597 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966" exitCode=0
Apr 17 17:27:56.347107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.346692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"}
Apr 17 17:27:56.351516 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.351481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8mfn" event={"ID":"35d91680-e317-4b76-b153-a4f79eba959a","Type":"ContainerStarted","Data":"063a124562029b3abe674c9d7cc52cc8edb036e67e233d456ceab45c1467ae57"}
Apr 17 17:27:56.351632 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.351524 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8mfn" event={"ID":"35d91680-e317-4b76-b153-a4f79eba959a","Type":"ContainerStarted","Data":"ccca19ade3064d54d857d618abb26bc4aee627efd574275d2bbb340d77fc8ecb"}
Apr 17 17:27:56.430229 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:56.430175 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f8mfn" podStartSLOduration=3.775318431
podStartE2EDuration="11.430157516s" podCreationTimestamp="2026-04-17 17:27:45 +0000 UTC" firstStartedPulling="2026-04-17 17:27:46.897649405 +0000 UTC m=+159.836554647" lastFinishedPulling="2026-04-17 17:27:54.552488484 +0000 UTC m=+167.491393732" observedRunningTime="2026-04-17 17:27:56.429134385 +0000 UTC m=+169.368039650" watchObservedRunningTime="2026-04-17 17:27:56.430157516 +0000 UTC m=+169.369062776" Apr 17 17:27:57.356059 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:57.356016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r7dzq" event={"ID":"32351f44-6fa2-44e0-81dd-549f2f19d705","Type":"ContainerStarted","Data":"b8262ce402769d9d21587f173541eeeb4d3775d4e2e0f847dcde1f3e4cfc3624"} Apr 17 17:27:58.364806 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.364517 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frpbr" event={"ID":"4efcb7d2-ede6-4e86-920c-19ff33a94f7b","Type":"ContainerStarted","Data":"bd312300ddd6ebf4f8e52ac4bdc65c89eb575c5f284da55fdc44cd36ef12f483"} Apr 17 17:27:58.364806 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.364563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-frpbr" event={"ID":"4efcb7d2-ede6-4e86-920c-19ff33a94f7b","Type":"ContainerStarted","Data":"ec7a9d26a113154497c47387ee4f29f7b51877e739a6f78314d0c8e24f37d7b9"} Apr 17 17:27:58.364806 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.364620 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-frpbr" Apr 17 17:27:58.369481 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.369292 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"deea374c49e78fab9954abb38138d90ff7dd42f344a16d70a8bed7d59adfa73e"} Apr 17 17:27:58.369481 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:27:58.369427 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"c5d3e59b3dd9a5f92e1b6691e7ae10e3447ea3c3f6aa12b25f0dd2d1bf7ea32b"} Apr 17 17:27:58.369481 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.369444 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"8b5240e0cdbb3f55c742a485dc3d941c20e7599da2d13cbab4d31799eee75f1d"} Apr 17 17:27:58.384096 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.383659 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-frpbr" podStartSLOduration=137.28312545 podStartE2EDuration="2m20.383640141s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:27:54.505624364 +0000 UTC m=+167.444529605" lastFinishedPulling="2026-04-17 17:27:57.606139048 +0000 UTC m=+170.545044296" observedRunningTime="2026-04-17 17:27:58.382490782 +0000 UTC m=+171.321396045" watchObservedRunningTime="2026-04-17 17:27:58.383640141 +0000 UTC m=+171.322545396" Apr 17 17:27:58.653582 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:27:58.653504 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt" Apr 17 17:28:02.388154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.388039 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} Apr 17 17:28:02.388154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.388088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} Apr 17 17:28:02.388154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.388106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} Apr 17 17:28:02.388154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.388120 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} Apr 17 17:28:02.388154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.388132 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} Apr 17 17:28:02.391096 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.391068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" 
event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"1fbd1576063c3d83de6063baf16befc9100e2dbc77e1160bddf6f9309e46dd3f"} Apr 17 17:28:02.391203 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.391105 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"5060c36aabf379af3d7fcaf498e9372571d052dee09d50938bbe7f5567dce923"} Apr 17 17:28:02.391203 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.391121 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" event={"ID":"a205f472-b21e-4f79-ae09-96715d56ac66","Type":"ContainerStarted","Data":"c5931ab9159a007527b38fde3f3c9f8b8e46f8efde795e948b505f34e66c1135"} Apr 17 17:28:02.391407 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.391384 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:28:02.392804 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.392764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r7dzq" event={"ID":"32351f44-6fa2-44e0-81dd-549f2f19d705","Type":"ContainerStarted","Data":"8b45854274825beccd38f1eb3c6eb9cff1b67463fd4b5d23879134baee2f9122"} Apr 17 17:28:02.398290 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.398271 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" Apr 17 17:28:02.419947 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:02.419896 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-55c6bf5658-7jfq4" podStartSLOduration=7.284901194 podStartE2EDuration="14.419875537s" podCreationTimestamp="2026-04-17 17:27:48 +0000 UTC" firstStartedPulling="2026-04-17 
17:27:54.506108511 +0000 UTC m=+167.445013751" lastFinishedPulling="2026-04-17 17:28:01.641082832 +0000 UTC m=+174.579988094" observedRunningTime="2026-04-17 17:28:02.418558767 +0000 UTC m=+175.357464031" watchObservedRunningTime="2026-04-17 17:28:02.419875537 +0000 UTC m=+175.358780803" Apr 17 17:28:03.400644 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:03.400596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerStarted","Data":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} Apr 17 17:28:03.434722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:03.434666 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.973745248 podStartE2EDuration="12.434647328s" podCreationTimestamp="2026-04-17 17:27:51 +0000 UTC" firstStartedPulling="2026-04-17 17:27:54.187617322 +0000 UTC m=+167.126522563" lastFinishedPulling="2026-04-17 17:28:01.64851939 +0000 UTC m=+174.587424643" observedRunningTime="2026-04-17 17:28:03.432947035 +0000 UTC m=+176.371852311" watchObservedRunningTime="2026-04-17 17:28:03.434647328 +0000 UTC m=+176.373552593" Apr 17 17:28:03.435134 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:03.435104 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r7dzq" podStartSLOduration=140.177274002 podStartE2EDuration="2m25.43509501s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:27:56.383259771 +0000 UTC m=+169.322165029" lastFinishedPulling="2026-04-17 17:28:01.641080778 +0000 UTC m=+174.579986037" observedRunningTime="2026-04-17 17:28:02.460094279 +0000 UTC m=+175.398999539" watchObservedRunningTime="2026-04-17 17:28:03.43509501 +0000 UTC m=+176.374000281" Apr 17 17:28:06.615555 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:06.615512 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:08.376788 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:08.376753 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-frpbr" Apr 17 17:28:21.069843 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.069761 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66d8c9c64d-k8knx" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" containerName="console" containerID="cri-o://eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6" gracePeriod=15 Apr 17 17:28:21.313059 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.313036 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66d8c9c64d-k8knx_5a9c78c6-8531-4bfe-a000-a26721924369/console/0.log" Apr 17 17:28:21.313169 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.313109 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:28:21.354147 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354068 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354147 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354115 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354147 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354141 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354394 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354321 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354394 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354389 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rjw\" (UniqueName: \"kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354499 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354445 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config" (OuterVolumeSpecName: "console-config") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:21.354499 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354465 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca" (OuterVolumeSpecName: "service-ca") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:21.354499 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354478 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:21.354609 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354497 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config\") pod \"5a9c78c6-8531-4bfe-a000-a26721924369\" (UID: \"5a9c78c6-8531-4bfe-a000-a26721924369\") " Apr 17 17:28:21.354787 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354759 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-service-ca\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.354787 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354783 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-console-config\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.354931 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.354798 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a9c78c6-8531-4bfe-a000-a26721924369-oauth-serving-cert\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.356644 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.356611 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:21.356721 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.356653 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:21.356765 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.356729 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw" (OuterVolumeSpecName: "kube-api-access-v4rjw") pod "5a9c78c6-8531-4bfe-a000-a26721924369" (UID: "5a9c78c6-8531-4bfe-a000-a26721924369"). InnerVolumeSpecName "kube-api-access-v4rjw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:21.455542 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.455508 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v4rjw\" (UniqueName: \"kubernetes.io/projected/5a9c78c6-8531-4bfe-a000-a26721924369-kube-api-access-v4rjw\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.455542 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.455538 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-oauth-config\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.455744 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.455553 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a9c78c6-8531-4bfe-a000-a26721924369-console-serving-cert\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:28:21.459335 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459313 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66d8c9c64d-k8knx_5a9c78c6-8531-4bfe-a000-a26721924369/console/0.log" Apr 17 17:28:21.459485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459352 2571 generic.go:358] "Generic (PLEG): container finished" podID="5a9c78c6-8531-4bfe-a000-a26721924369" containerID="eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6" exitCode=2 Apr 17 17:28:21.459485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d8c9c64d-k8knx" event={"ID":"5a9c78c6-8531-4bfe-a000-a26721924369","Type":"ContainerDied","Data":"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6"} Apr 17 17:28:21.459485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d8c9c64d-k8knx" event={"ID":"5a9c78c6-8531-4bfe-a000-a26721924369","Type":"ContainerDied","Data":"9ddd8fb7b5007670c9e93a6133f2ddbbdc5a591129dd3f8629867746bb8b9e9e"} Apr 17 17:28:21.459485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459439 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d8c9c64d-k8knx" Apr 17 17:28:21.459485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.459456 2571 scope.go:117] "RemoveContainer" containerID="eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6" Apr 17 17:28:21.469043 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.469018 2571 scope.go:117] "RemoveContainer" containerID="eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6" Apr 17 17:28:21.469317 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:28:21.469294 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6\": container with ID starting with eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6 not found: ID does not exist" containerID="eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6" Apr 17 17:28:21.469368 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.469327 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6"} err="failed to get container status \"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6\": rpc error: code = NotFound desc = could not find container \"eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6\": container with ID starting with eee8a6dbc2e0d247017eedc2b00da517865ce4e5dd827c8449d602c0a5ce38d6 not found: ID does not exist" Apr 17 17:28:21.481529 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.481493 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"] Apr 17 17:28:21.490124 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:21.490089 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66d8c9c64d-k8knx"] Apr 17 17:28:21.658422 ip-10-0-135-105 kubenswrapper[2571]: 
I0417 17:28:21.658335 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" path="/var/lib/kubelet/pods/5a9c78c6-8531-4bfe-a000-a26721924369/volumes" Apr 17 17:28:44.533823 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:44.533734 2571 generic.go:358] "Generic (PLEG): container finished" podID="687643ca-4ad3-44fa-8f6b-ef388beace89" containerID="975e25b509920a10259e6f262cd24dad26c9442085c3de924e43a758210fa5f3" exitCode=0 Apr 17 17:28:44.534349 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:44.533813 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" event={"ID":"687643ca-4ad3-44fa-8f6b-ef388beace89","Type":"ContainerDied","Data":"975e25b509920a10259e6f262cd24dad26c9442085c3de924e43a758210fa5f3"} Apr 17 17:28:44.534349 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:44.534206 2571 scope.go:117] "RemoveContainer" containerID="975e25b509920a10259e6f262cd24dad26c9442085c3de924e43a758210fa5f3" Apr 17 17:28:45.539312 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:45.539274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jmwrd" event={"ID":"687643ca-4ad3-44fa-8f6b-ef388beace89","Type":"ContainerStarted","Data":"f2dfc81d2743b595be3c6bdfc6959081c4fa000a89c0af544cc1ff59819f429f"} Apr 17 17:28:51.634236 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:51.634190 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:51.658825 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:51.658799 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:52.576099 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:28:52.576071 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:09.629409 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629376 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:09.629869 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629817 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="prometheus" containerID="cri-o://9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499" gracePeriod=600 Apr 17 17:29:09.629954 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629870 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e" gracePeriod=600 Apr 17 17:29:09.629954 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629881 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="config-reloader" containerID="cri-o://f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5" gracePeriod=600 Apr 17 17:29:09.630152 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629881 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="thanos-sidecar" containerID="cri-o://9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8" gracePeriod=600 Apr 17 17:29:09.630152 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.629828 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy" containerID="cri-o://d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f" gracePeriod=600 Apr 17 17:29:09.630152 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.630048 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-web" containerID="cri-o://fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7" gracePeriod=600 Apr 17 17:29:09.866646 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.866623 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:09.997035 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.996930 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997035 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.996973 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997035 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997013 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997035 ip-10-0-135-105 kubenswrapper[2571]: 
I0417 17:29:09.997032 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997057 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnvw\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997101 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997131 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997162 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:29:09.997192 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997220 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997250 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997281 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997308 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.997355 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997346 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997386 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997422 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997434 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997469 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997520 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle\") pod \"014735e0-f64c-4e42-add4-183e239411ae\" (UID: \"014735e0-f64c-4e42-add4-183e239411ae\") " Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997668 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997794 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997901 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997920 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-metrics-client-ca\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:09.998038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.997930 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:09.999870 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.999835 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:09.999974 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.999897 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw" (OuterVolumeSpecName: "kube-api-access-nfnvw") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "kube-api-access-nfnvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:09.999974 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:09.999903 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.000369 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.000338 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.001074 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.000842 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config" (OuterVolumeSpecName: "config") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.001430 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.001241 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.001430 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.001358 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:10.001716 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.001690 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.002020 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.001970 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:10.002529 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.002503 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.002648 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.002626 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out" (OuterVolumeSpecName: "config-out") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:10.002719 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.002700 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.003266 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.003248 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.011855 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.011823 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config" (OuterVolumeSpecName: "web-config") pod "014735e0-f64c-4e42-add4-183e239411ae" (UID: "014735e0-f64c-4e42-add4-183e239411ae"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:10.098584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098541 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-db\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098576 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098584 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098587 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/014735e0-f64c-4e42-add4-183e239411ae-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098599 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-tls-assets\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098610 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nfnvw\" (UniqueName: 
\"kubernetes.io/projected/014735e0-f64c-4e42-add4-183e239411ae-kube-api-access-nfnvw\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098622 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098632 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-web-config\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098640 2571 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-kube-rbac-proxy\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098650 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098659 2571 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-metrics-client-certs\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098668 2571 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-grpc-tls\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098675 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/014735e0-f64c-4e42-add4-183e239411ae-config-out\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098684 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098693 2571 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.098821 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.098702 2571 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/014735e0-f64c-4e42-add4-183e239411ae-config\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\"" Apr 17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616766 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e" exitCode=0 Apr 17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616794 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f" exitCode=0 Apr 
17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616800 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7" exitCode=0 Apr 17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616806 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8" exitCode=0 Apr 17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616812 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5" exitCode=0 Apr 17 17:29:10.616805 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616817 2571 generic.go:358] "Generic (PLEG): container finished" podID="014735e0-f64c-4e42-add4-183e239411ae" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499" exitCode=0 Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616854 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616884 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616947 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616966 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e" Apr 17 17:29:10.617159 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.616974 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"014735e0-f64c-4e42-add4-183e239411ae","Type":"ContainerDied","Data":"a5a41c5fc26a02fc034e52dac25ba2d762b70ae1c435b24c9789c0988fc3537d"} Apr 17 17:29:10.625682 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.625661 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f" Apr 17 17:29:10.632608 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.632465 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7" Apr 17 17:29:10.639090 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.639072 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8" Apr 17 17:29:10.642849 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.642824 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:10.647365 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.647323 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:10.659853 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.659833 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5" Apr 17 17:29:10.666597 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.666581 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499" Apr 17 17:29:10.673800 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.673783 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966" Apr 17 17:29:10.680318 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.680294 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e" 
Apr 17 17:29:10.680596 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.680578 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e" Apr 17 17:29:10.680648 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.680606 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist" Apr 17 17:29:10.680648 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.680626 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f" Apr 17 17:29:10.680923 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.680887 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f" Apr 17 17:29:10.681026 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.680931 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist" Apr 17 17:29:10.681026 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.680955 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7" Apr 17 17:29:10.681288 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.681260 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7" Apr 17 17:29:10.681395 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681295 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist" Apr 17 17:29:10.681395 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681314 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8" Apr 17 17:29:10.681616 ip-10-0-135-105 
kubenswrapper[2571]: E0417 17:29:10.681581 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8" Apr 17 17:29:10.681659 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681626 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist" Apr 17 17:29:10.681659 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681646 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5" Apr 17 17:29:10.681928 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.681899 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5" Apr 17 17:29:10.682060 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681934 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} 
err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist" Apr 17 17:29:10.682060 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.681953 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499" Apr 17 17:29:10.682267 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.682249 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499" Apr 17 17:29:10.682308 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682274 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist" Apr 17 17:29:10.682308 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682295 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966" Apr 17 17:29:10.682509 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:29:10.682489 2571 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.682551 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682515 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.682551 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682534 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"
Apr 17 17:29:10.682777 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682758 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist"
Apr 17 17:29:10.682777 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682777 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"
Apr 17 17:29:10.682965 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682948 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist"
Apr 17 17:29:10.683043 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.682965 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"
Apr 17 17:29:10.683096 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683066 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:29:10.683217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683196 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist"
Apr 17 17:29:10.683217 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683215 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"
Apr 17 17:29:10.683394 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683382 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-web"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683396 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-web"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683406 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="config-reloader"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683411 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="config-reloader"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683419 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="thanos-sidecar"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683424 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="thanos-sidecar"
Apr 17 17:29:10.683434 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683420 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683435 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683438 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="init-config-reloader"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683444 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="init-config-reloader"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683449 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="prometheus"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683454 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="prometheus"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683461 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" containerName="console"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683466 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" containerName="console"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683474 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-thanos"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683479 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-thanos"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683487 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683495 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683544 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a9c78c6-8531-4bfe-a000-a26721924369" containerName="console"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683556 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-web"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683566 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="config-reloader"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683572 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="prometheus"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683578 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683584 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="kube-rbac-proxy-thanos"
Apr 17 17:29:10.683607 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683593 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="014735e0-f64c-4e42-add4-183e239411ae" containerName="thanos-sidecar"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683651 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683674 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683869 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.683892 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684110 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.684287 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684130 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"
Apr 17 17:29:10.684487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684360 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist"
Apr 17 17:29:10.684487 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684375 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"
Apr 17 17:29:10.684600 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684583 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist"
Apr 17 17:29:10.684636 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684608 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"
Apr 17 17:29:10.684828 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684807 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist"
Apr 17 17:29:10.684870 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.684831 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"
Apr 17 17:29:10.685102 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685082 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist"
Apr 17 17:29:10.685102 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685102 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"
Apr 17 17:29:10.685358 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685337 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist"
Apr 17 17:29:10.685417 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685358 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"
Apr 17 17:29:10.685642 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685624 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist"
Apr 17 17:29:10.685642 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685641 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.685853 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685834 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.685895 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.685854 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"
Apr 17 17:29:10.686102 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686082 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist"
Apr 17 17:29:10.686164 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686104 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"
Apr 17 17:29:10.686301 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686287 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist"
Apr 17 17:29:10.686345 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686302 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"
Apr 17 17:29:10.686521 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686503 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist"
Apr 17 17:29:10.686571 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686522 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"
Apr 17 17:29:10.686771 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686750 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist"
Apr 17 17:29:10.686851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.686786 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"
Apr 17 17:29:10.687045 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687011 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist"
Apr 17 17:29:10.687045 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687037 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"
Apr 17 17:29:10.687173 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687066 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:29:10.687553 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687316 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist"
Apr 17 17:29:10.687553 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687344 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.687719 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687641 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.687719 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687657 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"
Apr 17 17:29:10.687880 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687861 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist"
Apr 17 17:29:10.687880 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.687880 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"
Apr 17 17:29:10.688224 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688192 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist"
Apr 17 17:29:10.688274 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688226 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"
Apr 17 17:29:10.688454 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688435 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist"
Apr 17 17:29:10.688454 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688453 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"
Apr 17 17:29:10.688668 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688643 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist"
Apr 17 17:29:10.688732 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688670 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"
Apr 17 17:29:10.688858 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688838 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist"
Apr 17 17:29:10.688858 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.688857 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"
Apr 17 17:29:10.689096 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689079 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist"
Apr 17 17:29:10.689096 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689094 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.689266 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689251 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.689266 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689265 2571 scope.go:117] "RemoveContainer" containerID="3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"
Apr 17 17:29:10.689475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689406 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e"} err="failed to get container status \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": rpc error: code = NotFound desc = could not find container \"3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e\": container with ID starting with 3056ac8250cebf0cae5800169e24097d2c5121a67c07d2e17abe08e80691c04e not found: ID does not exist"
Apr 17 17:29:10.689475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689429 2571 scope.go:117] "RemoveContainer" containerID="d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"
Apr 17 17:29:10.689683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689651 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f"} err="failed to get container status \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": rpc error: code = NotFound desc = could not find container \"d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f\": container with ID starting with d147c6fd547ed0f51b80b6d63a78c7ce1f8751fecc059f65e0f208b029bee48f not found: ID does not exist"
Apr 17 17:29:10.689683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689677 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 17:29:10.689839 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689684 2571 scope.go:117] "RemoveContainer" containerID="fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"
Apr 17 17:29:10.689839 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689768 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jn6zl\""
Apr 17 17:29:10.689839 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689797 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 17:29:10.689839 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689829 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 17:29:10.690034 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689895 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 17:29:10.690034 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.689808 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 17:29:10.690110 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690036 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7"} err="failed to get container status \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": rpc error: code = NotFound desc = could not find container \"fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7\": container with ID starting with fcc2a231614eab91e08acad051aa273b5b52eba69096f5415db60a778efb80e7 not found: ID does not exist"
Apr 17 17:29:10.690110 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690055 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-33rvvt8hga5fo\""
Apr 17 17:29:10.690110 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690060 2571 scope.go:117] "RemoveContainer" containerID="9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"
Apr 17 17:29:10.690250 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690234 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 17:29:10.690309 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690250 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 17:29:10.690404 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690337 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8"} err="failed to get container status \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": rpc error: code = NotFound desc = could not find container \"9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8\": container with ID starting with 9a5a553e99c26f67e31ee841033b6b8b4d3683128f5dd7847aebe2cbabe96bc8 not found: ID does not exist"
Apr 17 17:29:10.690404 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690360 2571 scope.go:117] "RemoveContainer" containerID="f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"
Apr 17 17:29:10.690591 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690577 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 17:29:10.690662 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690609 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5"} err="failed to get container status \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": rpc error: code = NotFound desc = could not find container \"f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5\": container with ID starting with f80c755c80c73de7fd016eaa21a7a62f234616b8909c9ba4846b06a8babb80c5 not found: ID does not exist"
Apr 17 17:29:10.690662 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690635 2571 scope.go:117] "RemoveContainer" containerID="9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"
Apr 17 17:29:10.690773 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690718 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 17:29:10.691028 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 17:29:10.691028 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.690924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 17:29:10.691178 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.691056 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499"} err="failed to get container status \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": rpc error: code = NotFound desc = could not find container \"9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499\": container with ID starting with 9cccb9c46b1b1f471775be7391d866fd68dce4e193229335b09ae04aa45f4499 not found: ID does not exist"
Apr 17 17:29:10.691178 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.691078 2571 scope.go:117] "RemoveContainer" containerID="77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"
Apr 17 17:29:10.691367 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.691340 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966"} err="failed to get container status \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": rpc error: code = NotFound desc = could not find container \"77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966\": container with ID starting with 77dc481ddfa8af5ddc12f408ab9bf139762279f970376c2363b033e360178966 not found: ID does not exist"
Apr 17 17:29:10.699346 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.699321 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 17:29:10.700683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.700661 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 17:29:10.702847 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.702825 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:29:10.804251 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804211 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:29:10.804251 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID:
\"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87qtq\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-kube-api-access-87qtq\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config-out\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804465 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804464 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804705 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-web-config\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804705 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804705 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804705 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804894 
ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.804894 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.805117 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.804925 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87qtq\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-kube-api-access-87qtq\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config-out\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906155 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906142 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-web-config\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:29:10.906254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.906475 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.907201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.907201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.907201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config\") 
pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.907201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.906672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.907727 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.907701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.909701 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.909402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-web-config\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.909701 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.909439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config-out\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.909701 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.909624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-config\") pod \"prometheus-k8s-0\" (UID: 
\"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.909920 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.909786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910151 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910321 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.910965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.911385 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.911352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.912051 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.912029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.912403 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.912373 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.912726 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.912708 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.912915 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.912897 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.914588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.914570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87qtq\" (UniqueName: \"kubernetes.io/projected/74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3-kube-api-access-87qtq\") pod \"prometheus-k8s-0\" (UID: \"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:10.999190 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:10.999134 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:11.134635 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:11.134602 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:11.138105 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:29:11.138077 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bfa4d5_d22c_4b29_aae3_5f1fb1f86df3.slice/crio-fc802b822d03de472db268b4438a39af81585ad4aff7824d4cbe744d9efdf13b WatchSource:0}: Error finding container fc802b822d03de472db268b4438a39af81585ad4aff7824d4cbe744d9efdf13b: Status 404 returned error can't find the container with id fc802b822d03de472db268b4438a39af81585ad4aff7824d4cbe744d9efdf13b Apr 17 17:29:11.622255 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:11.622193 2571 generic.go:358] "Generic (PLEG): container finished" podID="74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3" containerID="c7efb4b952e425729faf529cb1f0ba5cce63e4c6014f900925da5b5488a97209" exitCode=0 Apr 17 17:29:11.622440 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:11.622278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerDied","Data":"c7efb4b952e425729faf529cb1f0ba5cce63e4c6014f900925da5b5488a97209"} Apr 17 17:29:11.622440 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:11.622321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"fc802b822d03de472db268b4438a39af81585ad4aff7824d4cbe744d9efdf13b"} Apr 17 17:29:11.657784 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:11.657758 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014735e0-f64c-4e42-add4-183e239411ae" 
path="/var/lib/kubelet/pods/014735e0-f64c-4e42-add4-183e239411ae/volumes" Apr 17 17:29:12.627478 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"effd7130cdbaf7381139efec81c387dcedfbdb8145cca30ed1864a6d39b82236"} Apr 17 17:29:12.627478 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"829014860ba4d6bb00d973c3d12ca4f1963b082fe32dbfccd741528bdd5f0cae"} Apr 17 17:29:12.627683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"6c7c6093198f2229f418fbf2449414eced133a4dba44a9ceff7cd06269a289f9"} Apr 17 17:29:12.627683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627510 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"f54e8efa2e7f6d4cb7b4e3b866351238b988fc751b534d9cfbad7811682f2bde"} Apr 17 17:29:12.627683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627520 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"1a34f24489129985cc63eafb099f5ccba320bcfde2d246496b8101d799ba52b2"} Apr 17 17:29:12.627683 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.627534 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3","Type":"ContainerStarted","Data":"b1192add5067efd46756668acbfcf5a73adde823660fe241f581fbeae11ab5b2"}
Apr 17 17:29:12.658834 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:12.658787 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.658769372 podStartE2EDuration="2.658769372s" podCreationTimestamp="2026-04-17 17:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:29:12.656244847 +0000 UTC m=+245.595150110" watchObservedRunningTime="2026-04-17 17:29:12.658769372 +0000 UTC m=+245.597674669"
Apr 17 17:29:16.000230 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:16.000183 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:29:18.481231 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:18.481187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:29:18.483621 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:18.483587 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4-metrics-certs\") pod \"network-metrics-daemon-zbptt\" (UID: \"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4\") " pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:29:18.757162 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:18.757136 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\""
Apr 17 17:29:18.765062 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:18.765035 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbptt"
Apr 17 17:29:18.903168 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:18.903084 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zbptt"]
Apr 17 17:29:18.905407 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:29:18.905380 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda93a2a1c_fc1b_4d1b_b47c_6b2140f30cf4.slice/crio-5da8b74d810534df84ef1fa7d4134191a9784aea13cb8eae0b264f1108a37f99 WatchSource:0}: Error finding container 5da8b74d810534df84ef1fa7d4134191a9784aea13cb8eae0b264f1108a37f99: Status 404 returned error can't find the container with id 5da8b74d810534df84ef1fa7d4134191a9784aea13cb8eae0b264f1108a37f99
Apr 17 17:29:19.652632 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:19.652589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbptt" event={"ID":"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4","Type":"ContainerStarted","Data":"5da8b74d810534df84ef1fa7d4134191a9784aea13cb8eae0b264f1108a37f99"}
Apr 17 17:29:20.657534 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:20.657492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbptt" event={"ID":"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4","Type":"ContainerStarted","Data":"31e853e8e69e2bb2dc5269b2c204a3203667f3eb2d2057565de9cc967537faa7"}
Apr 17 17:29:20.657534 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:20.657537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbptt" event={"ID":"a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4","Type":"ContainerStarted","Data":"7ad5040c370e6e644a0c4073f7a5b3eff484957d138dbe9dd99b7790adcc9aae"}
Apr 17 17:29:20.677813 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:29:20.677042 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zbptt" podStartSLOduration=252.715909326 podStartE2EDuration="4m13.677023345s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:29:18.907148787 +0000 UTC m=+251.846054027" lastFinishedPulling="2026-04-17 17:29:19.868262799 +0000 UTC m=+252.807168046" observedRunningTime="2026-04-17 17:29:20.674658245 +0000 UTC m=+253.613563508" watchObservedRunningTime="2026-04-17 17:29:20.677023345 +0000 UTC m=+253.615928610"
Apr 17 17:30:07.526147 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:07.526118 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:30:07.526690 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:07.526197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:30:07.532718 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:07.532696 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:30:07.532830 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:07.532720 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:30:07.536243 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:07.536226 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:30:10.999838 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:10.999787 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:30:11.015922 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:11.015899 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:30:11.825197 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:30:11.825171 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:34:56.201860 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.201818 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-rfq7r"]
Apr 17 17:34:56.205165 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.205143 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rfq7r"
Apr 17 17:34:56.207631 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.207604 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 17:34:56.207906 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.207877 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 17:34:56.208019 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.207935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 17:34:56.208688 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.208667 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-g5wcj\""
Apr 17 17:34:56.212934 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.212915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rfq7r"]
Apr 17 17:34:56.266034 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.265982 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84gx\" (UniqueName: \"kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx\") pod \"s3-init-rfq7r\" (UID: \"002e26ed-3e74-4bbb-8e43-e15a8a2fc691\") " pod="kserve/s3-init-rfq7r"
Apr 17 17:34:56.366719 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.366688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j84gx\" (UniqueName: \"kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx\") pod \"s3-init-rfq7r\" (UID: \"002e26ed-3e74-4bbb-8e43-e15a8a2fc691\") " pod="kserve/s3-init-rfq7r"
Apr 17 17:34:56.374832 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.374798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84gx\" (UniqueName: \"kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx\") pod \"s3-init-rfq7r\" (UID: \"002e26ed-3e74-4bbb-8e43-e15a8a2fc691\") " pod="kserve/s3-init-rfq7r"
Apr 17 17:34:56.526648 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.526613 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rfq7r"
Apr 17 17:34:56.647817 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.647738 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rfq7r"]
Apr 17 17:34:56.650785 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:34:56.650751 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod002e26ed_3e74_4bbb_8e43_e15a8a2fc691.slice/crio-169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e WatchSource:0}: Error finding container 169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e: Status 404 returned error can't find the container with id 169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e
Apr 17 17:34:56.652489 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:56.652468 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:34:57.659426 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:34:57.659379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rfq7r" event={"ID":"002e26ed-3e74-4bbb-8e43-e15a8a2fc691","Type":"ContainerStarted","Data":"169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e"}
Apr 17 17:35:01.671289 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:01.671253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rfq7r" event={"ID":"002e26ed-3e74-4bbb-8e43-e15a8a2fc691","Type":"ContainerStarted","Data":"d5203025e9ecd903ff18b0753ccf0cde14a518b577c162f8263f927ebcac1cd9"}
Apr 17 17:35:01.688449 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:01.688391 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-rfq7r" podStartSLOduration=1.326202364 podStartE2EDuration="5.688371507s" podCreationTimestamp="2026-04-17 17:34:56 +0000 UTC" firstStartedPulling="2026-04-17 17:34:56.652596374 +0000 UTC m=+589.591501615" lastFinishedPulling="2026-04-17 17:35:01.014765506 +0000 UTC m=+593.953670758" observedRunningTime="2026-04-17 17:35:01.687268198 +0000 UTC m=+594.626173454" watchObservedRunningTime="2026-04-17 17:35:01.688371507 +0000 UTC m=+594.627276771"
Apr 17 17:35:04.680704 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:04.680666 2571 generic.go:358] "Generic (PLEG): container finished" podID="002e26ed-3e74-4bbb-8e43-e15a8a2fc691" containerID="d5203025e9ecd903ff18b0753ccf0cde14a518b577c162f8263f927ebcac1cd9" exitCode=0
Apr 17 17:35:04.681101 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:04.680713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rfq7r" event={"ID":"002e26ed-3e74-4bbb-8e43-e15a8a2fc691","Type":"ContainerDied","Data":"d5203025e9ecd903ff18b0753ccf0cde14a518b577c162f8263f927ebcac1cd9"}
Apr 17 17:35:05.808712 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:05.808687 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rfq7r"
Apr 17 17:35:05.955519 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:05.955438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84gx\" (UniqueName: \"kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx\") pod \"002e26ed-3e74-4bbb-8e43-e15a8a2fc691\" (UID: \"002e26ed-3e74-4bbb-8e43-e15a8a2fc691\") "
Apr 17 17:35:05.957595 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:05.957566 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx" (OuterVolumeSpecName: "kube-api-access-j84gx") pod "002e26ed-3e74-4bbb-8e43-e15a8a2fc691" (UID: "002e26ed-3e74-4bbb-8e43-e15a8a2fc691"). InnerVolumeSpecName "kube-api-access-j84gx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:35:06.056415 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:06.056374 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j84gx\" (UniqueName: \"kubernetes.io/projected/002e26ed-3e74-4bbb-8e43-e15a8a2fc691-kube-api-access-j84gx\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\""
Apr 17 17:35:06.687614 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:06.687578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rfq7r" event={"ID":"002e26ed-3e74-4bbb-8e43-e15a8a2fc691","Type":"ContainerDied","Data":"169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e"}
Apr 17 17:35:06.687782 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:06.687621 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169af43fe11f15c59a6292debd237125139056e84d036d2fca96e29cbfb9379e"
Apr 17 17:35:06.687782 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:06.687596 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rfq7r"
Apr 17 17:35:07.549710 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:07.549685 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:35:07.550174 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:07.549964 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:35:07.555261 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:07.555237 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:35:07.555410 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:35:07.555311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:40:07.571302 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:40:07.571227 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:40:07.572687 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:40:07.572652 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:40:07.577440 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:40:07.577416 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:40:07.579038 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:40:07.579014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:45:07.599013 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:45:07.598959 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:45:07.601851 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:45:07.601828 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:45:07.605012 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:45:07.604975 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:45:07.607161 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:45:07.607144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log"
Apr 17 17:49:05.330969 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.330889 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5bzk/must-gather-vlb4p"]
Apr 17 17:49:05.331449 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.331293 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="002e26ed-3e74-4bbb-8e43-e15a8a2fc691" containerName="s3-init"
Apr 17 17:49:05.331449 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.331306 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="002e26ed-3e74-4bbb-8e43-e15a8a2fc691" containerName="s3-init"
Apr 17 17:49:05.331449 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.331356 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="002e26ed-3e74-4bbb-8e43-e15a8a2fc691" containerName="s3-init"
Apr 17 17:49:05.333544 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.333527 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.336169 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.336149 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-z5bzk\"/\"default-dockercfg-bfdjt\""
Apr 17 17:49:05.336169 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.336154 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z5bzk\"/\"openshift-service-ca.crt\""
Apr 17 17:49:05.337012 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.336981 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z5bzk\"/\"kube-root-ca.crt\""
Apr 17 17:49:05.343341 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.343315 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5bzk/must-gather-vlb4p"]
Apr 17 17:49:05.446344 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.446307 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmcpb\" (UniqueName: \"kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.446518 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.446417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.547722 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.547683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmcpb\" (UniqueName: \"kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.547849 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.547743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.548065 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.548049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.558248 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.558223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmcpb\" (UniqueName: \"kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb\") pod \"must-gather-vlb4p\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") " pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.657298 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.657216 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:05.782548 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.782514 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5bzk/must-gather-vlb4p"]
Apr 17 17:49:05.786061 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:49:05.786029 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0377c4eb_521f_4a74_8b1b_055a939bc794.slice/crio-8428c2c5271c82cb1e4a56c78ce448e4bc66cbcde614112f8160825b8a7d0822 WatchSource:0}: Error finding container 8428c2c5271c82cb1e4a56c78ce448e4bc66cbcde614112f8160825b8a7d0822: Status 404 returned error can't find the container with id 8428c2c5271c82cb1e4a56c78ce448e4bc66cbcde614112f8160825b8a7d0822
Apr 17 17:49:05.787663 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:05.787642 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:49:06.159775 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:06.159740 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" event={"ID":"0377c4eb-521f-4a74-8b1b-055a939bc794","Type":"ContainerStarted","Data":"8428c2c5271c82cb1e4a56c78ce448e4bc66cbcde614112f8160825b8a7d0822"}
Apr 17 17:49:10.175175 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:10.175115 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" event={"ID":"0377c4eb-521f-4a74-8b1b-055a939bc794","Type":"ContainerStarted","Data":"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"}
Apr 17 17:49:11.184044 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:11.183985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" event={"ID":"0377c4eb-521f-4a74-8b1b-055a939bc794","Type":"ContainerStarted","Data":"13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c"}
Apr 17 17:49:11.200164 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:11.200106 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" podStartSLOduration=1.97500973 podStartE2EDuration="6.200091555s" podCreationTimestamp="2026-04-17 17:49:05 +0000 UTC" firstStartedPulling="2026-04-17 17:49:05.787794361 +0000 UTC m=+1438.726699603" lastFinishedPulling="2026-04-17 17:49:10.012876178 +0000 UTC m=+1442.951781428" observedRunningTime="2026-04-17 17:49:11.199374684 +0000 UTC m=+1444.138279948" watchObservedRunningTime="2026-04-17 17:49:11.200091555 +0000 UTC m=+1444.138996817"
Apr 17 17:49:28.238959 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:28.238923 2571 generic.go:358] "Generic (PLEG): container finished" podID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerID="e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1" exitCode=0
Apr 17 17:49:28.239362 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:28.239007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" event={"ID":"0377c4eb-521f-4a74-8b1b-055a939bc794","Type":"ContainerDied","Data":"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"}
Apr 17 17:49:28.239362 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:28.239322 2571 scope.go:117] "RemoveContainer" containerID="e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"
Apr 17 17:49:29.069693 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:29.069663 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5bzk_must-gather-vlb4p_0377c4eb-521f-4a74-8b1b-055a939bc794/gather/0.log"
Apr 17 17:49:32.338464 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:32.338436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lzfd4_78665d2b-9809-4082-abf9-62de202c8f2f/global-pull-secret-syncer/0.log"
Apr 17 17:49:32.514669 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:32.514636 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w7cnl_1f443eb6-ad37-49de-a7de-e96834d08311/konnectivity-agent/0.log"
Apr 17 17:49:32.536457 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:32.536431 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-105.ec2.internal_2d0b942ad205f5fb58ca3113d3bd1bee/haproxy/0.log"
Apr 17 17:49:34.469673 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.469611 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5bzk/must-gather-vlb4p"]
Apr 17 17:49:34.470241 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.469964 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="copy" containerID="cri-o://13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c" gracePeriod=2
Apr 17 17:49:34.471485 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.471459 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5bzk/must-gather-vlb4p"]
Apr 17 17:49:34.472216 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.472173 2571 status_manager.go:895] "Failed to get status for pod" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" err="pods \"must-gather-vlb4p\" is forbidden: User \"system:node:ip-10-0-135-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5bzk\": no relationship found between node 'ip-10-0-135-105.ec2.internal' and this object"
Apr 17 17:49:34.699461 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.699438 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5bzk_must-gather-vlb4p_0377c4eb-521f-4a74-8b1b-055a939bc794/copy/0.log"
Apr 17 17:49:34.699804 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.699790 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:34.701794 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.701769 2571 status_manager.go:895] "Failed to get status for pod" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" err="pods \"must-gather-vlb4p\" is forbidden: User \"system:node:ip-10-0-135-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5bzk\": no relationship found between node 'ip-10-0-135-105.ec2.internal' and this object"
Apr 17 17:49:34.790188 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.790151 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmcpb\" (UniqueName: \"kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb\") pod \"0377c4eb-521f-4a74-8b1b-055a939bc794\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") "
Apr 17 17:49:34.792529 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.792492 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb" (OuterVolumeSpecName: "kube-api-access-mmcpb") pod "0377c4eb-521f-4a74-8b1b-055a939bc794" (UID: "0377c4eb-521f-4a74-8b1b-055a939bc794"). InnerVolumeSpecName "kube-api-access-mmcpb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:49:34.890712 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.890681 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output\") pod \"0377c4eb-521f-4a74-8b1b-055a939bc794\" (UID: \"0377c4eb-521f-4a74-8b1b-055a939bc794\") "
Apr 17 17:49:34.890927 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.890911 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmcpb\" (UniqueName: \"kubernetes.io/projected/0377c4eb-521f-4a74-8b1b-055a939bc794-kube-api-access-mmcpb\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\""
Apr 17 17:49:34.892048 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.892024 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0377c4eb-521f-4a74-8b1b-055a939bc794" (UID: "0377c4eb-521f-4a74-8b1b-055a939bc794"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:49:34.992170 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:34.992128 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0377c4eb-521f-4a74-8b1b-055a939bc794-must-gather-output\") on node \"ip-10-0-135-105.ec2.internal\" DevicePath \"\""
Apr 17 17:49:35.265944 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.265916 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5bzk_must-gather-vlb4p_0377c4eb-521f-4a74-8b1b-055a939bc794/copy/0.log"
Apr 17 17:49:35.266276 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.266240 2571 generic.go:358] "Generic (PLEG): container finished" podID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerID="13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c" exitCode=143
Apr 17 17:49:35.266409 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.266299 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5bzk/must-gather-vlb4p"
Apr 17 17:49:35.266409 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.266341 2571 scope.go:117] "RemoveContainer" containerID="13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c"
Apr 17 17:49:35.268558 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.268531 2571 status_manager.go:895] "Failed to get status for pod" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" err="pods \"must-gather-vlb4p\" is forbidden: User \"system:node:ip-10-0-135-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5bzk\": no relationship found between node 'ip-10-0-135-105.ec2.internal' and this object"
Apr 17 17:49:35.274045 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.274022 2571 scope.go:117] "RemoveContainer" containerID="e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"
Apr 17 17:49:35.277109 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.277085 2571 status_manager.go:895] "Failed to get status for pod" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" pod="openshift-must-gather-z5bzk/must-gather-vlb4p" err="pods \"must-gather-vlb4p\" is forbidden: User \"system:node:ip-10-0-135-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5bzk\": no relationship found between node 'ip-10-0-135-105.ec2.internal' and this object"
Apr 17 17:49:35.286603 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.285866 2571 scope.go:117] "RemoveContainer" containerID="13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c"
Apr 17 17:49:35.287210 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:49:35.287183 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c\": container with ID starting with 13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c not found: ID does not exist" containerID="13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c"
Apr 17 17:49:35.287297 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.287220 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c"} err="failed to get container status \"13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c\": rpc error: code = NotFound desc = could not find container \"13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c\": container with ID starting with 13d0fc60898cdebab9537b99fd21f8a26495df170e945b423a77072c515b934c not found: ID does not exist"
Apr 17 17:49:35.287297 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.287243 2571 scope.go:117] "RemoveContainer" containerID="e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"
Apr 17 17:49:35.287639 ip-10-0-135-105 kubenswrapper[2571]: E0417 17:49:35.287621 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1\": container with ID starting with e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1 not found: ID does not exist" containerID="e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"
Apr 17 17:49:35.287698 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.287648 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1"} err="failed to get container status \"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1\": rpc error: code = NotFound desc = could not find container \"e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1\": container with ID starting with e08b52ecd51798dde05ec4ff87d674e41119253260966dbe34688c2a5099f6f1 not found: ID does not exist"
Apr 17 17:49:35.658201 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:35.658131 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" path="/var/lib/kubelet/pods/0377c4eb-521f-4a74-8b1b-055a939bc794/volumes"
Apr 17 17:49:36.231518 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.231487 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8mfn_35d91680-e317-4b76-b153-a4f79eba959a/node-exporter/0.log"
Apr 17 17:49:36.260040 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.260015 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8mfn_35d91680-e317-4b76-b153-a4f79eba959a/kube-rbac-proxy/0.log"
Apr 17 17:49:36.284285 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.284254 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8mfn_35d91680-e317-4b76-b153-a4f79eba959a/init-textfile/0.log"
Apr 17 17:49:36.393620 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.393540 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/prometheus/0.log"
Apr 17 17:49:36.414119 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.414097 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/config-reloader/0.log"
Apr 17 17:49:36.434345 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.434315 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/thanos-sidecar/0.log"
Apr 17 17:49:36.455186 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.455163 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/kube-rbac-proxy-web/0.log"
Apr 17 17:49:36.478011 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.477973 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/kube-rbac-proxy/0.log"
Apr 17 17:49:36.498123 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.498097 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/kube-rbac-proxy-thanos/0.log"
Apr 17 17:49:36.518732 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.518705 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74bfa4d5-d22c-4b29-aae3-5f1fb1f86df3/init-config-reloader/0.log"
Apr 17 17:49:36.547314 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.547288 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-k9kxz_841bb604-188c-482a-815c-d8da332a5b02/prometheus-operator/0.log"
Apr 17 17:49:36.571055 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.571025 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-k9kxz_841bb604-188c-482a-815c-d8da332a5b02/kube-rbac-proxy/0.log"
Apr 17 17:49:36.602958 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.602927 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-clqzj_050d613e-46b7-4949-ab46-ebcf42deeeb3/prometheus-operator-admission-webhook/0.log"
Apr 17 17:49:36.704705 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.704632 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/thanos-query/0.log"
Apr 17 17:49:36.730682 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.730655 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/kube-rbac-proxy-web/0.log"
Apr 17 17:49:36.756396 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.756369 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/kube-rbac-proxy/0.log"
Apr 17 17:49:36.784284 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.784252 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/prom-label-proxy/0.log"
Apr 17 17:49:36.806913 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.806887 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/kube-rbac-proxy-rules/0.log"
Apr 17 17:49:36.834854 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:36.834820 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55c6bf5658-7jfq4_a205f472-b21e-4f79-ae09-96715d56ac66/kube-rbac-proxy-metrics/0.log"
Apr 17 17:49:37.961168 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:37.961144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-92kx6_6a266ff4-f2ce-4fe5-b4cc-806ebc260ff6/networking-console-plugin/0.log"
Apr 17 17:49:38.394050 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:38.394017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/1.log"
Apr 17 17:49:38.398925 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:38.398904 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nq8bf_9dd87d69-6559-458b-94cc-fc8a74f27e9a/console-operator/2.log" Apr 17 17:49:38.805973 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:38.805939 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-v6scj_c71e7458-3162-4d35-af2f-4cb2853de04b/download-server/0.log" Apr 17 17:49:39.187588 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9"] Apr 17 17:49:39.187951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187878 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="gather" Apr 17 17:49:39.187951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187889 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="gather" Apr 17 17:49:39.187951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187897 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="copy" Apr 17 17:49:39.187951 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187902 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="copy" Apr 17 17:49:39.188108 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187960 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="copy" Apr 17 17:49:39.188108 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.187972 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0377c4eb-521f-4a74-8b1b-055a939bc794" containerName="gather" Apr 17 17:49:39.191701 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.191679 2571 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.194078 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.194055 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"kube-root-ca.crt\"" Apr 17 17:49:39.195029 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.195010 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7ttz7\"/\"default-dockercfg-ml4hx\"" Apr 17 17:49:39.195129 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.195035 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"openshift-service-ca.crt\"" Apr 17 17:49:39.198466 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.198440 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9"] Apr 17 17:49:39.218365 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.218340 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-mvh8f_390c586c-bb4c-4700-b4eb-ccd63ad31506/volume-data-source-validator/0.log" Apr 17 17:49:39.223698 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.223678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-proc\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.223768 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.223728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-sys\") pod 
\"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.223816 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.223798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-lib-modules\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.223854 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.223824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-podres\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.223899 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.223884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjzh\" (UniqueName: \"kubernetes.io/projected/7db85fe1-f6e5-4957-8a85-efde6f1ce406-kube-api-access-6tjzh\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325202 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-proc\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 
17:49:39.325224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-sys\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-lib-modules\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-podres\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-proc\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjzh\" (UniqueName: \"kubernetes.io/projected/7db85fe1-f6e5-4957-8a85-efde6f1ce406-kube-api-access-6tjzh\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " 
pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325406 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-sys\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325633 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-lib-modules\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.325633 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.325459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7db85fe1-f6e5-4957-8a85-efde6f1ce406-podres\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.332972 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.332948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjzh\" (UniqueName: \"kubernetes.io/projected/7db85fe1-f6e5-4957-8a85-efde6f1ce406-kube-api-access-6tjzh\") pod \"perf-node-gather-daemonset-tdkn9\" (UID: \"7db85fe1-f6e5-4957-8a85-efde6f1ce406\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.502023 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.501906 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:39.636480 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.636454 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9"] Apr 17 17:49:39.639263 ip-10-0-135-105 kubenswrapper[2571]: W0417 17:49:39.639229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7db85fe1_f6e5_4957_8a85_efde6f1ce406.slice/crio-8a5c406626952f5b8e401de906b42052e7ee64e49dcc40d65abdc0e251675545 WatchSource:0}: Error finding container 8a5c406626952f5b8e401de906b42052e7ee64e49dcc40d65abdc0e251675545: Status 404 returned error can't find the container with id 8a5c406626952f5b8e401de906b42052e7ee64e49dcc40d65abdc0e251675545 Apr 17 17:49:39.850154 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.850126 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-frpbr_4efcb7d2-ede6-4e86-920c-19ff33a94f7b/dns/0.log" Apr 17 17:49:39.870414 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.870388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-frpbr_4efcb7d2-ede6-4e86-920c-19ff33a94f7b/kube-rbac-proxy/0.log" Apr 17 17:49:39.997639 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:39.997610 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hzc4l_fd529c67-6434-4e26-bfa8-0edc94a3b098/dns-node-resolver/0.log" Apr 17 17:49:40.283289 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:40.283251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" event={"ID":"7db85fe1-f6e5-4957-8a85-efde6f1ce406","Type":"ContainerStarted","Data":"83fbe3e14de3f380b280c2d3a89c8f04ef481e12c3a9ee9527b462ea6bbc0747"} Apr 17 17:49:40.283289 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:40.283290 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" event={"ID":"7db85fe1-f6e5-4957-8a85-efde6f1ce406","Type":"ContainerStarted","Data":"8a5c406626952f5b8e401de906b42052e7ee64e49dcc40d65abdc0e251675545"} Apr 17 17:49:40.283702 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:40.283319 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:40.298254 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:40.298207 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" podStartSLOduration=1.298192601 podStartE2EDuration="1.298192601s" podCreationTimestamp="2026-04-17 17:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:49:40.29674258 +0000 UTC m=+1473.235647854" watchObservedRunningTime="2026-04-17 17:49:40.298192601 +0000 UTC m=+1473.237097863" Apr 17 17:49:40.467674 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:40.467646 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hxcr5_e48539b9-0ec2-4c8c-bdd6-a57dbcc0fd2b/node-ca/0.log" Apr 17 17:49:41.522270 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:41.522241 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r7dzq_32351f44-6fa2-44e0-81dd-549f2f19d705/serve-healthcheck-canary/0.log" Apr 17 17:49:41.892180 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:41.892074 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-244jf_457a7752-1528-46e3-a693-0e6eddb138c7/kube-rbac-proxy/0.log" Apr 17 17:49:41.913421 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:41.913392 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-244jf_457a7752-1528-46e3-a693-0e6eddb138c7/exporter/0.log" Apr 17 17:49:41.956797 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:41.956769 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-244jf_457a7752-1528-46e3-a693-0e6eddb138c7/extractor/0.log" Apr 17 17:49:44.140797 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:44.140771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-rfq7r_002e26ed-3e74-4bbb-8e43-e15a8a2fc691/s3-init/0.log" Apr 17 17:49:46.296586 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:46.296556 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-tdkn9" Apr 17 17:49:47.805935 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:47.805905 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-x4dpv_5d8fa452-8b8f-48f3-90bf-a980adde658a/migrator/0.log" Apr 17 17:49:47.828610 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:47.828583 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-x4dpv_5d8fa452-8b8f-48f3-90bf-a980adde658a/graceful-termination/0.log" Apr 17 17:49:49.182186 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.182115 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/kube-multus-additional-cni-plugins/0.log" Apr 17 17:49:49.203245 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.203219 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/egress-router-binary-copy/0.log" Apr 17 17:49:49.224107 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.224076 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/cni-plugins/0.log" Apr 17 17:49:49.243459 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.243437 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/bond-cni-plugin/0.log" Apr 17 17:49:49.262291 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.262261 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/routeoverride-cni/0.log" Apr 17 17:49:49.281592 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.281562 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/whereabouts-cni-bincopy/0.log" Apr 17 17:49:49.300825 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.300802 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fv24_dcf417a0-db1f-4d94-9f13-bd789e955760/whereabouts-cni/0.log" Apr 17 17:49:49.651093 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.651054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kx7gk_54be9ce9-8e34-41ad-a9f8-f058d0ba0624/kube-multus/0.log" Apr 17 17:49:49.814524 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.814493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zbptt_a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4/network-metrics-daemon/0.log" Apr 17 17:49:49.836789 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:49.836761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zbptt_a93a2a1c-fc1b-4d1b-b47c-6b2140f30cf4/kube-rbac-proxy/0.log" Apr 17 17:49:50.956077 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:49:50.956045 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-controller/0.log" Apr 17 17:49:50.976311 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:50.976278 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/0.log" Apr 17 17:49:50.982098 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:50.982079 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovn-acl-logging/1.log" Apr 17 17:49:51.001561 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:51.001534 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/kube-rbac-proxy-node/0.log" Apr 17 17:49:51.021076 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:51.021034 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:49:51.038875 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:51.038831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/northd/0.log" Apr 17 17:49:51.060695 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:51.060665 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/nbdb/0.log" Apr 17 17:49:51.081358 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:51.081314 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/sbdb/0.log" Apr 17 17:49:51.171191 ip-10-0-135-105 
kubenswrapper[2571]: I0417 17:49:51.171158 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hhm5l_354f25fd-1072-4dea-8ce2-1953b417053a/ovnkube-controller/0.log" Apr 17 17:49:52.352591 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:52.352558 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6kv77_47718045-17b9-4963-8fe8-9bf81ac118cd/check-endpoints/0.log" Apr 17 17:49:52.373836 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:52.373815 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lwq8d_5a407c11-3cbe-4521-8abd-48c6506368fb/network-check-target-container/0.log" Apr 17 17:49:53.298488 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:53.298457 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2l9v7_ba50df31-c8c4-481e-8f5a-53f3fc99f52c/iptables-alerter/0.log" Apr 17 17:49:53.934945 ip-10-0-135-105 kubenswrapper[2571]: I0417 17:49:53.934918 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-j67sl_38a8697c-ea92-4488-90c5-13d587599dba/tuned/0.log"