Apr 20 20:05:25.524129 ip-10-0-140-155 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:25.975945 ip-10-0-140-155 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.975945 ip-10-0-140-155 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:25.975945 ip-10-0-140-155 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.975945 ip-10-0-140-155 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:25.975945 ip-10-0-140-155 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.976704 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.976622    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:25.978852 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978837    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:25.978852 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978852    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978856    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978860    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978863    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978866    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978869    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978871    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978875    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978878    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978880    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978883    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978886    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978888    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978891    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978894    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978897    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978901    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978904    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978907    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.978915 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978910    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978912    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978916    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978918    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978921    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978924    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978927    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978930    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978933    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978935    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978938    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978941    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978943    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978945    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978948    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978951    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978956    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978960    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978963    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:25.979416 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978965    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978968    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978971    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978973    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978976    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978978    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978980    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978983    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978985    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978988    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978990    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978993    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978995    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.978998    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979000    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979004    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979007    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979009    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979012    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979014    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:25.980151 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979017    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979019    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979022    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979024    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979026    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979029    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979033    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979035    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979038    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979040    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979043    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979045    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979047    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979050    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979052    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979055    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979058    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979061    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979064    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979067    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:25.980639 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979069    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979071    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979074    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979076    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979079    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979082    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.979084    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980819    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980829    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980833    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980836    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980840    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980844    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980847    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980850    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980852    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980855    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980858    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980861    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:25.981140 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980863    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980866    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980868    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980870    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980873    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980875    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980878    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980881    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980883    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980886    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980888    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980891    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980893    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980896    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980899    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980901    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980903    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980906    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980908    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980911    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:25.981601 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980918    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980922    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980925    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980927    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980930    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980933    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980935    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980938    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980941    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980943    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980955    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980959    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980961    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980964    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980967    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980970    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980972    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980975    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980977    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980980    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.982144 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980982    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980985    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980987    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980990    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980993    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980995    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.980998    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981000    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981003    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981005    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981007    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981010    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981012    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981017    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981020    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981023    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981025    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981029    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981031    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981034    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:25.982618 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981036    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981039    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981041    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981043    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981046    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981050    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981052    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981054    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981058    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981062    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981065    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981067    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981071    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.981073    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982260    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982269    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982276    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982281    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982285    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982289    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982293    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:25.983124 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982298    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982301    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982304    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982308    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982311    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982315    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982318    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982321    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982324    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982327    2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982329    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982332    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982336    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982339    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982342    2575 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982344    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982348    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982352    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982355    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982359    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982362    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982364    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982367    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982370    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982373    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:25.983621 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982376    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982380    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982382    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982385    2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982388    2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982391    2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982393    2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982398    2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982401    2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982404    2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982407    2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982410    2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982414    2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982417    2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982420    2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982423    2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982426    2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982429    2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982432    2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982435    2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982437    2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982440    2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982443    2575 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982446    2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982449    2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:05:25.984228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982453    2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982456    2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982459    2575 flags.go:64] FLAG:
--healthz-port="10248" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982461 2575 flags.go:64] FLAG: --help="false" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982464 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-155.ec2.internal" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982467 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982470 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982473 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982476 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982479 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982482 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982485 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982487 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982490 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982493 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982496 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:05:25.984845 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:05:25.982499 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982502 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982509 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982512 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982515 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982518 2575 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982522 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982524 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:05:25.984845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982527 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982538 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982540 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982543 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982546 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982549 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982552 2575 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982555 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982558 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982563 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982566 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982571 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982574 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982577 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982580 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982583 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982586 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982602 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982606 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982614 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982617 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 
20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982620 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982623 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:25.985407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982626 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982632 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982635 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982638 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982642 2575 flags.go:64] FLAG: --port="10250" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982645 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982647 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d61ad74c05d8d999" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982651 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982653 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982656 2575 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982659 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982662 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:05:25.985958 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:05:25.982665 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982668 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982671 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982674 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982677 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982680 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982683 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982686 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982689 2575 flags.go:64] FLAG: --runonce="false" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982691 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982694 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982697 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982699 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982702 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:25.985958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982705 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 
20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982708 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982711 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982713 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982716 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982719 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982721 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982724 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982727 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982734 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982740 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982743 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982746 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982749 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982752 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982755 2575 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982758 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982760 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982763 2575 flags.go:64] FLAG: --v="2" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982767 2575 flags.go:64] FLAG: --version="false" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982771 2575 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982775 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.982778 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982890 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982894 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:25.986603 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982897 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982901 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982904 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982907 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:25.987195 ip-10-0-140-155 
kubenswrapper[2575]: W0420 20:05:25.982909 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982912 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982915 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982918 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982920 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982923 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982925 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982929 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982933 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982935 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982938 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982941 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982944 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982947 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982950 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:25.987195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982952 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982954 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982957 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982960 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982962 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982964 2575 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982967 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982970 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982972 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982975 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982977 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982979 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982982 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982984 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982987 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982990 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982992 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982994 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 
20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982997 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.982999 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:25.987666 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983002 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983004 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983007 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983009 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983011 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983014 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983017 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983021 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983023 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983027 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983030 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983033 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983036 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983038 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983041 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983043 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983046 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983048 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983050 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983053 2575 feature_gate.go:328] unrecognized 
feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:25.988247 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983055 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983058 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983060 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983062 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983065 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983068 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983070 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983072 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983075 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983078 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983080 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983082 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:25.988736 ip-10-0-140-155 
kubenswrapper[2575]: W0420 20:05:25.983085 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983087 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983089 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983092 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983094 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983096 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983098 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.988736 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983101 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983103 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983107 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983111 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983114 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.983116 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.989213 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.983825 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:25.990128 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.990110 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:05:25.990163 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.990129 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990178 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990184 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990187 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990189 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990192 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990195 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:25.990195 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990198 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990201 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990204 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990206 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990209 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990211 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990214 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990216 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990220 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990222 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990225 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990228 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990230 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990233 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990236 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990238 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990241 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990243 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990246 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990248 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:25.990368 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990251 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990253 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990256 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990258 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990261 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990264 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990267 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990269 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990271 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990274 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990276 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990279 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990281 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990284 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990287 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990289 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990292 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990294 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990297 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990299 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:25.990981 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990302 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990305 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990307 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990310 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990312 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990314 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990317 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990321 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990325 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990328 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990331 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990334 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990338 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990341 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990343 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990346 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990349 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990351 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:25.991457 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990354 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990357 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990359 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990361 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990364 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990366 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990369 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990372 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990375 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990378 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990381 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990384 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990386 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990388 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990391 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990393 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990396 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990398 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990401 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990403 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:25.991895 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990405 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990408 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.990413 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990513 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990518 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990521 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990524 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990526 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990529 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990531 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990534 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990536 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990539 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990542 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990544 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990547 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:25.992380 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990549 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990552 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990554 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990557 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990559 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990562 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990566 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990571 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990574 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990577 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990580 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990583 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990587 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990591 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990594 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990596 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990599 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990602 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990605 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:25.992783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990607 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990610 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990612 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990614 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990617 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990620 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990622 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990625 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990627 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990630 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990632 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990635 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990638 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990640 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990643 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990645 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990648 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990651 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990653 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990656 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:25.993262 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990658 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990660 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990663 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990665 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990668 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990670 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990673 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990675 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990678 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990680 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990683 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990685 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990687 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990690 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990693 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990695 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990698 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990700 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990703 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990705 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:25.993749 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990708 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990710 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990712 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990715 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990718 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990721 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990723 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990726 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990728 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990731 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990733 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990735 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990738 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:25.990740 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.990745 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:25.994274 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.991531 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:05:25.995442 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.995427 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:05:25.996590 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.996579 2575 server.go:1019] "Starting client certificate rotation"
Apr 20 20:05:25.996688 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.996673 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:25.996720 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:25.996716 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:26.021474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.021458 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:26.023915 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.023897 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:26.042031 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.042007 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:26.047973 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.047957 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:26.049206 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.049177 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:26.052225 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.052203 2575 fs.go:135] Filesystem UUIDs: map[3b71ed7a-1d71-4e12-8654-4ebf6612e473:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9e2b9873-1aa4-408e-88ab-ac016c36b0b9:/dev/nvme0n1p3]
Apr 20 20:05:26.052291 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.052224 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:26.054734 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.054718 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:26.056883 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.056759 2575 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:26.055694428 +0000 UTC m=+0.410297165 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3072022 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28127c0a258e5a91488aed2919536e SystemUUID:ec28127c-0a25-8e5a-9148-8aed2919536e BootID:f68bb2ff-cc7e-44a7-a03f-d369bb3f7b3f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5f:ab:9f:06:03 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5f:ab:9f:06:03 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:00:32:30:0e:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:26.056883 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.056878 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:26.057006 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.056961 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:26.058579 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.058559 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:26.058709 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.058580 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-155.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:05:26.058755 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.058721 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:05:26.058755 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.058728 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:05:26.058755 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.058742 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:26.059497 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.059487 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:26.061305 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.061294 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:26.061558 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.061549 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:05:26.063592 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.063582 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:05:26.063625 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.063601 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:05:26.063625 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.063612 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:05:26.063625 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.063621 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:05:26.063736 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.063629 2575 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 20:05:26.064641 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.064628 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:26.064679 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.064653 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:26.068243 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.068221 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:26.069604 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.069590 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:26.071248 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071237 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071253 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071260 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071265 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071271 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071276 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071282 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071287 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071295 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:26.071297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071301 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:26.071528 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071311 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:05:26.071528 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.071319 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:26.072250 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.072240 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:26.072250 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.072250 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:05:26.074920 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.074894 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-155.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:26.075141 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.074907 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:26.075892 ip-10-0-140-155 kubenswrapper[2575]: 
I0420 20:05:26.075879 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:26.075937 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.075915 2575 server.go:1295] "Started kubelet" Apr 20 20:05:26.076028 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.075990 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:26.076090 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.076057 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:26.076120 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.076109 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:26.076809 ip-10-0-140-155 systemd[1]: Started Kubernetes Kubelet. Apr 20 20:05:26.077044 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.077029 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:26.079781 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.079768 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:26.082548 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.082512 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:05:26.082686 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.082667 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:26.083689 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.083671 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:05:26.083837 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.083819 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:26.083924 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.083684 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:26.083991 ip-10-0-140-155 kubenswrapper[2575]: I0420 
20:05:26.083955 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:26.083991 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.083961 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:26.084607 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.084576 2575 factory.go:55] Registering systemd factory Apr 20 20:05:26.084697 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.084611 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:26.085346 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085329 2575 factory.go:153] Registering CRI-O factory Apr 20 20:05:26.085400 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085352 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:26.085446 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085403 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:05:26.085446 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085434 2575 factory.go:103] Registering Raw factory Apr 20 20:05:26.085541 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085450 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 20:05:26.085947 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.085930 2575 manager.go:319] Starting recovery of all containers Apr 20 20:05:26.086321 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.086297 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.087045 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.086888 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2r2fw" Apr 20 20:05:26.087870 
ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.087843 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-155.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 20:05:26.088850 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.088818 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 20:05:26.088941 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.088906 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-155.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:26.091583 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.087931 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-155.ec2.internal.18a8295b86acbf74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-155.ec2.internal,UID:ip-10-0-140-155.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-155.ec2.internal,},FirstTimestamp:2026-04-20 20:05:26.075891572 +0000 UTC m=+0.430494309,LastTimestamp:2026-04-20 20:05:26.075891572 +0000 UTC m=+0.430494309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-155.ec2.internal,}" Apr 20 20:05:26.095603 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.095474 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2r2fw" Apr 20 20:05:26.099459 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.099439 2575 manager.go:324] Recovery completed Apr 20 20:05:26.103256 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.103244 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.105463 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105447 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.105533 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105477 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.105533 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105490 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.105976 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105961 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:05:26.105976 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105973 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:05:26.106068 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.105991 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:26.107758 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.107696 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-140-155.ec2.internal.18a8295b886ff94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-155.ec2.internal,UID:ip-10-0-140-155.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-155.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-155.ec2.internal,},FirstTimestamp:2026-04-20 20:05:26.105463114 +0000 UTC m=+0.460065851,LastTimestamp:2026-04-20 20:05:26.105463114 +0000 UTC m=+0.460065851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-155.ec2.internal,}" Apr 20 20:05:26.108330 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.108320 2575 policy_none.go:49] "None policy: Start" Apr 20 20:05:26.108372 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.108341 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:05:26.108372 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.108351 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.142911 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.142937 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.142947 2575 server.go:85] "Starting device plugin registration server" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.143203 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.143219 2575 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.143309 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.143382 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.143392 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.143875 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:05:26.149849 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.143908 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.214858 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.214821 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:05:26.216229 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.216214 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:05:26.216282 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.216242 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:05:26.216282 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.216268 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:05:26.216364 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.216287 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:05:26.216364 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.216328 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:05:26.219917 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.219898 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:26.244308 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.244255 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.246950 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.246934 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.247035 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.246962 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.247035 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.246973 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.247035 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.246995 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.254869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.254855 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.254915 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.254880 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-155.ec2.internal\": node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 
20:05:26.277411 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.277383 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.317025 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.316997 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal"] Apr 20 20:05:26.317090 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.317078 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.318477 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.318459 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.318545 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.318491 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.318545 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.318502 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.319548 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.319536 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.319689 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.319675 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.319735 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.319705 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.320319 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320301 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.320414 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320327 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.320414 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320355 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.320414 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320366 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.320414 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320329 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.320414 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.320410 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.321382 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.321369 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.321429 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.321391 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:26.322305 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.322285 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:26.323662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.323642 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:26.323758 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.323675 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:26.357463 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.357442 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-155.ec2.internal\" not found" node="ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.362019 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.362003 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-155.ec2.internal\" not found" node="ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.378234 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.378213 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.385525 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.385509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.385583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.385533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.385583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.385551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb94a1a385423d979d451957f03175cb-config\") pod \"kube-apiserver-proxy-ip-10-0-140-155.ec2.internal\" (UID: \"fb94a1a385423d979d451957f03175cb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.479270 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.479244 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.486614 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.486698 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.486698 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.486817 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb94a1a385423d979d451957f03175cb-config\") pod \"kube-apiserver-proxy-ip-10-0-140-155.ec2.internal\" (UID: \"fb94a1a385423d979d451957f03175cb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.486817 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b989593a346d1460ce3008799f9024b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal\" (UID: \"6b989593a346d1460ce3008799f9024b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.486817 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.486748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb94a1a385423d979d451957f03175cb-config\") pod \"kube-apiserver-proxy-ip-10-0-140-155.ec2.internal\" (UID: \"fb94a1a385423d979d451957f03175cb\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.580036 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.579970 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.660560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.660536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.665220 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.665204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:26.680762 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.680742 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.781356 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.781328 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.881862 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.881810 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.982397 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:26.982369 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:26.995755 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.995735 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 20:05:26.995894 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:26.995878 2575 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:05:27.083275 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.083251 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:27.083368 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:27.083263 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-155.ec2.internal\" not found" Apr 20 20:05:27.098082 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.098057 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:26 +0000 UTC" deadline="2027-10-05 15:41:08.417215611 +0000 UTC" Apr 20 20:05:27.098082 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.098079 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12787h35m41.319138945s" Apr 20 20:05:27.100106 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.100086 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:27.110583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.110564 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:27.112624 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.112610 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:27.146079 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.146016 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-t662p" Apr 20 20:05:27.151298 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.151281 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t662p" Apr 20 20:05:27.185345 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.185322 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" Apr 20 20:05:27.199949 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.199925 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:27.200749 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.200737 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" Apr 20 20:05:27.207933 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.207918 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:27.220978 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:27.220928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb94a1a385423d979d451957f03175cb.slice/crio-df935c04e82f7e3b0164040abd0efecf15aca4432a627685ae95d4666b1a909f WatchSource:0}: Error finding container df935c04e82f7e3b0164040abd0efecf15aca4432a627685ae95d4666b1a909f: Status 404 returned error can't find the container with id df935c04e82f7e3b0164040abd0efecf15aca4432a627685ae95d4666b1a909f Apr 20 20:05:27.224419 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.224398 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:05:27.289223 ip-10-0-140-155 
kubenswrapper[2575]: W0420 20:05:27.289200 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b989593a346d1460ce3008799f9024b.slice/crio-433a74926a04a0ac1dd020bd662145e59e96c9da8dffb979b3fa4c41616ebff7 WatchSource:0}: Error finding container 433a74926a04a0ac1dd020bd662145e59e96c9da8dffb979b3fa4c41616ebff7: Status 404 returned error can't find the container with id 433a74926a04a0ac1dd020bd662145e59e96c9da8dffb979b3fa4c41616ebff7 Apr 20 20:05:27.546486 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.546408 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:27.845017 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:27.844925 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:28.064390 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.064357 2575 apiserver.go:52] "Watching apiserver" Apr 20 20:05:28.070089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.070066 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:05:28.072986 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.072960 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm","openshift-dns/node-resolver-pmx7s","openshift-image-registry/node-ca-vhc9b","openshift-multus/multus-additional-cni-plugins-zln2l","kube-system/konnectivity-agent-b2hzv","openshift-cluster-node-tuning-operator/tuned-98bc5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal","openshift-multus/multus-z2mjf","openshift-multus/network-metrics-daemon-tkqg4","openshift-network-diagnostics/network-check-target-zwtx2","openshift-network-operator/iptables-alerter-5ksn8","openshift-ovn-kubernetes/ovnkube-node-5k984"] Apr 20 20:05:28.075942 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.075917 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.078006 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.077986 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.078453 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.078438 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:28.078514 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.078458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:28.078662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.078647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-52c62\"" Apr 20 20:05:28.080247 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.080221 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:05:28.080704 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.080686 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:05:28.080767 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.080725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nr8hs\"" Apr 20 20:05:28.081007 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.080992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:05:28.083808 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.083775 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.083904 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.083873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.085806 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.085770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:05:28.086245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086018 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wrl75\"" Apr 20 20:05:28.086245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086058 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:05:28.086245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:05:28.086245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086223 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:05:28.086245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086223 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:05:28.086497 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.086238 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n2vgw\"" Apr 20 20:05:28.088237 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.088214 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.088339 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.088299 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-b2hzv" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090450 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jzfff\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090651 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090694 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:05:28.090820 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090743 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:05:28.091114 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.090957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-prfnm\"" Apr 20 20:05:28.092592 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.092575 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:05:28.092851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.092835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:28.092960 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.092907 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:28.093048 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.093032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dgxxt\"" Apr 20 20:05:28.094485 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysconfig\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.094590 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-systemd\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.094590 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-sys\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.094590 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-socket-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.094724 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.094724 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-system-cni-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.094724 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cnibin\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.094864 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-modprobe-d\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.094864 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-conf\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.094864 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrn9k\" (UniqueName: \"kubernetes.io/projected/dc97d8be-f2be-4f95-b5f8-2f30bd207040-kube-api-access-nrn9k\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.094864 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-os-release\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.094864 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-binary-copy\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 
20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-d\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-lib-modules\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.094981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-tuned\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/825cf132-8688-4d28-b93d-28380334363a-hosts-file\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.095051 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/825cf132-8688-4d28-b93d-28380334363a-tmp-dir\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-run\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095093 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-var-lib-kubelet\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-host\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-tmp\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-registration-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-sys-fs\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" 
Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8js5n\" (UniqueName: \"kubernetes.io/projected/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kube-api-access-8js5n\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/daf7a90d-c40a-4f5d-acec-d016728caae2-kube-api-access-7zc4x\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-device-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc97d8be-f2be-4f95-b5f8-2f30bd207040-serviceca\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqxc\" (UniqueName: \"kubernetes.io/projected/825cf132-8688-4d28-b93d-28380334363a-kube-api-access-nzqxc\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xlp\" (UniqueName: \"kubernetes.io/projected/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-kube-api-access-p5xlp\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-kubernetes\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.095434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.095357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc97d8be-f2be-4f95-b5f8-2f30bd207040-host\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.097855 ip-10-0-140-155 kubenswrapper[2575]: 
I0420 20:05:28.097835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:28.097946 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.097895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:28.100045 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.100012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.100148 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.100059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.102500 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.102482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:05:28.102616 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.102596 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:05:28.102720 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.102704 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f8h82\""
Apr 20 20:05:28.102771 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.102763 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:05:28.103323 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.102964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:05:28.103323 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.103311 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:05:28.103714 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.103695 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vl2jc\""
Apr 20 20:05:28.103942 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.103896 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:05:28.104085 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.104071 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:05:28.104277 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.104264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:05:28.104935 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.104921 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:05:28.152757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.152728 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:27 +0000 UTC" deadline="2027-10-30 14:25:16.388197301 +0000 UTC"
Apr 20 20:05:28.152757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.152755 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13386h19m48.235444506s"
Apr 20 20:05:28.185383 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.185355 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:05:28.196346 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-tmp\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.196509 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-registration-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.196509 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.196509 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-env-overrides\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.196509 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-script-lib\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.196509 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-registration-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8js5n\" (UniqueName: \"kubernetes.io/projected/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kube-api-access-8js5n\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/daf7a90d-c40a-4f5d-acec-d016728caae2-kube-api-access-7zc4x\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-device-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-device-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196700 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 20:05:28.196757 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-multus\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-kubelet\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwrf\" (UniqueName: \"kubernetes.io/projected/36ded35a-5271-4357-8314-e71ef065d1c8-kube-api-access-wcwrf\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqxc\" (UniqueName: \"kubernetes.io/projected/825cf132-8688-4d28-b93d-28380334363a-kube-api-access-nzqxc\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.196974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-log-socket\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-kubernetes\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc97d8be-f2be-4f95-b5f8-2f30bd207040-host\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b"
Apr 20 20:05:28.197072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-multus-daemon-config\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-kubernetes\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-systemd\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-sys\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-socket-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc97d8be-f2be-4f95-b5f8-2f30bd207040-host\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-systemd\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cnibin\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-sys\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cnibin\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13fedadd-686b-4e93-a2ae-cbab6edc3bff-iptables-alerter-script\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-var-lib-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-k8s-cni-cncf-io\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-socket-dir\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-conf-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.197375 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-systemd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-ovn\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-conf\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-os-release\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-binary-copy\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-conf\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-os-release\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-socket-dir-parent\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-bin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fedadd-686b-4e93-a2ae-cbab6edc3bff-host-slash\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-netd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-lib-modules\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-tuned\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-lib-modules\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.198238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pb7c\" (UniqueName: \"kubernetes.io/projected/e5ae3fcb-9103-43ca-a45c-46f86f99016e-kube-api-access-5pb7c\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-kubelet\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.197929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/825cf132-8688-4d28-b93d-28380334363a-hosts-file\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-run\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-var-lib-kubelet\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-host\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-binary-copy\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-sys-fs\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc97d8be-f2be-4f95-b5f8-2f30bd207040-serviceca\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-system-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/825cf132-8688-4d28-b93d-28380334363a-hosts-file\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-run\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-host\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.199001 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198305 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-hostroot\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-var-lib-kubelet\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-systemd-units\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a163cdce-f73c-4e59-8bba-e14ea4b5515d-sys-fs\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xlp\" (UniqueName: \"kubernetes.io/projected/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-kube-api-access-p5xlp\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc97d8be-f2be-4f95-b5f8-2f30bd207040-serviceca\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-netns\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvg7g\" (UniqueName: \"kubernetes.io/projected/13fedadd-686b-4e93-a2ae-cbab6edc3bff-kube-api-access-mvg7g\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-config\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovn-node-metrics-cert\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cdb632c-969a-4d5b-96a5-44863d4b1606-agent-certs\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-multus-certs\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-etc-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysconfig\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-system-cni-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-cni-binary-copy\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.199560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysconfig\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.198985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-modprobe-d\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-system-cni-dir\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]:
I0420 20:05:28.199065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-modprobe-d\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-cnibin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrn9k\" (UniqueName: \"kubernetes.io/projected/dc97d8be-f2be-4f95-b5f8-2f30bd207040-kube-api-access-nrn9k\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-os-release\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 
20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-etc-kubernetes\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-netns\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6vsd\" (UniqueName: \"kubernetes.io/projected/3d96f75f-8097-4ea6-bf1c-c5bc31761969-kube-api-access-j6vsd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-d\") 
pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdb632c-969a-4d5b-96a5-44863d4b1606-konnectivity-ca\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-slash\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-node-log\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-bin\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.200249 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199420 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/825cf132-8688-4d28-b93d-28380334363a-tmp-dir\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.200830 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/825cf132-8688-4d28-b93d-28380334363a-tmp-dir\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.200830 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.200830 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.199828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-sysctl-d\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.200830 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.200251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-tmp\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.200830 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.200411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/daf7a90d-c40a-4f5d-acec-d016728caae2-etc-tuned\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.208815 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.208773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqxc\" (UniqueName: \"kubernetes.io/projected/825cf132-8688-4d28-b93d-28380334363a-kube-api-access-nzqxc\") pod \"node-resolver-pmx7s\" (UID: \"825cf132-8688-4d28-b93d-28380334363a\") " pod="openshift-dns/node-resolver-pmx7s" Apr 20 20:05:28.208958 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.208876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8js5n\" (UniqueName: \"kubernetes.io/projected/a163cdce-f73c-4e59-8bba-e14ea4b5515d-kube-api-access-8js5n\") pod \"aws-ebs-csi-driver-node-wxbqm\" (UID: \"a163cdce-f73c-4e59-8bba-e14ea4b5515d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" Apr 20 20:05:28.209246 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.209225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/daf7a90d-c40a-4f5d-acec-d016728caae2-kube-api-access-7zc4x\") pod \"tuned-98bc5\" (UID: \"daf7a90d-c40a-4f5d-acec-d016728caae2\") " pod="openshift-cluster-node-tuning-operator/tuned-98bc5" Apr 20 20:05:28.209313 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.209233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xlp\" (UniqueName: \"kubernetes.io/projected/77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87-kube-api-access-p5xlp\") pod \"multus-additional-cni-plugins-zln2l\" (UID: \"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87\") " pod="openshift-multus/multus-additional-cni-plugins-zln2l" Apr 20 20:05:28.210255 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.210232 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nrn9k\" (UniqueName: \"kubernetes.io/projected/dc97d8be-f2be-4f95-b5f8-2f30bd207040-kube-api-access-nrn9k\") pod \"node-ca-vhc9b\" (UID: \"dc97d8be-f2be-4f95-b5f8-2f30bd207040\") " pod="openshift-image-registry/node-ca-vhc9b" Apr 20 20:05:28.220486 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.220439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" event={"ID":"6b989593a346d1460ce3008799f9024b","Type":"ContainerStarted","Data":"433a74926a04a0ac1dd020bd662145e59e96c9da8dffb979b3fa4c41616ebff7"} Apr 20 20:05:28.221565 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.221540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" event={"ID":"fb94a1a385423d979d451957f03175cb","Type":"ContainerStarted","Data":"df935c04e82f7e3b0164040abd0efecf15aca4432a627685ae95d4666b1a909f"} Apr 20 20:05:28.300155 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-system-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-hostroot\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-system-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.300293 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " 
pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:28.300334 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-systemd-units\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-hostroot\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-netns\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.300382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:28.800358509 +0000 UTC m=+3.154961236 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-netns\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvg7g\" (UniqueName: \"kubernetes.io/projected/13fedadd-686b-4e93-a2ae-cbab6edc3bff-kube-api-access-mvg7g\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-config\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-systemd-units\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300472 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovn-node-metrics-cert\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cdb632c-969a-4d5b-96a5-44863d4b1606-agent-certs\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-multus-certs\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-multus-certs\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-etc-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.300710 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300695 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-cni-binary-copy\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-cnibin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-os-release\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-etc-kubernetes\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-netns\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6vsd\" (UniqueName: \"kubernetes.io/projected/3d96f75f-8097-4ea6-bf1c-c5bc31761969-kube-api-access-j6vsd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdb632c-969a-4d5b-96a5-44863d4b1606-konnectivity-ca\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-slash\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-node-log\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.300978 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-bin\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-netns\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-env-overrides\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-etc-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 
20:05:28.301056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-script-lib\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-multus\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.301333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-kubelet\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwrf\" (UniqueName: \"kubernetes.io/projected/36ded35a-5271-4357-8314-e71ef065d1c8-kube-api-access-wcwrf\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-log-socket\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-multus-daemon-config\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-config\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13fedadd-686b-4e93-a2ae-cbab6edc3bff-iptables-alerter-script\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-var-lib-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-k8s-cni-cncf-io\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-conf-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-systemd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-ovn\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-socket-dir-parent\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-bin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fedadd-686b-4e93-a2ae-cbab6edc3bff-host-slash\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-netd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pb7c\" (UniqueName: \"kubernetes.io/projected/e5ae3fcb-9103-43ca-a45c-46f86f99016e-kube-api-access-5pb7c\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:28.302089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-kubelet\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovnkube-script-lib\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-cnibin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-run-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-ovn\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-os-release\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-bin\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-etc-kubernetes\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-kubelet\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-socket-dir-parent\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-netd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.301893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fedadd-686b-4e93-a2ae-cbab6edc3bff-host-slash\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d96f75f-8097-4ea6-bf1c-c5bc31761969-env-overrides\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdb632c-969a-4d5b-96a5-44863d4b1606-konnectivity-ca\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-run-systemd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-run-k8s-cni-cncf-io\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-var-lib-openvswitch\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.302763 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-conf-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-slash\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-node-log\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-multus-cni-dir\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-cni-binary-copy\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-cni-bin\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-host-kubelet\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/36ded35a-5271-4357-8314-e71ef065d1c8-host-var-lib-cni-multus\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/36ded35a-5271-4357-8314-e71ef065d1c8-multus-daemon-config\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13fedadd-686b-4e93-a2ae-cbab6edc3bff-iptables-alerter-script\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.303468 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.302429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d96f75f-8097-4ea6-bf1c-c5bc31761969-log-socket\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303955 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.303575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d96f75f-8097-4ea6-bf1c-c5bc31761969-ovn-node-metrics-cert\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.303955 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.303924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cdb632c-969a-4d5b-96a5-44863d4b1606-agent-certs\") pod \"konnectivity-agent-b2hzv\" (UID: \"2cdb632c-969a-4d5b-96a5-44863d4b1606\") " pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:28.308368 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.308347 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:05:28.308368 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.308368 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:05:28.308539 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.308381 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:28.308539 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.308444 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:28.808427823 +0000 UTC m=+3.163030562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:28.310704 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.310669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvg7g\" (UniqueName: \"kubernetes.io/projected/13fedadd-686b-4e93-a2ae-cbab6edc3bff-kube-api-access-mvg7g\") pod \"iptables-alerter-5ksn8\" (UID: \"13fedadd-686b-4e93-a2ae-cbab6edc3bff\") " pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.310877 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.310826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pb7c\" (UniqueName: \"kubernetes.io/projected/e5ae3fcb-9103-43ca-a45c-46f86f99016e-kube-api-access-5pb7c\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:28.310947 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.310933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwrf\" (UniqueName: \"kubernetes.io/projected/36ded35a-5271-4357-8314-e71ef065d1c8-kube-api-access-wcwrf\") pod \"multus-z2mjf\" (UID: \"36ded35a-5271-4357-8314-e71ef065d1c8\") " pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.311028 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.311007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6vsd\" (UniqueName: \"kubernetes.io/projected/3d96f75f-8097-4ea6-bf1c-c5bc31761969-kube-api-access-j6vsd\") pod \"ovnkube-node-5k984\" (UID: \"3d96f75f-8097-4ea6-bf1c-c5bc31761969\") " pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.393012 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.392926 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-98bc5"
Apr 20 20:05:28.399806 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.399763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm"
Apr 20 20:05:28.408335 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.408311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pmx7s"
Apr 20 20:05:28.414890 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.414867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ksn8"
Apr 20 20:05:28.423548 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.423531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:28.429148 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.429130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zln2l"
Apr 20 20:05:28.435839 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.435820 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:28.442470 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.442445 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z2mjf"
Apr 20 20:05:28.448070 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.448046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vhc9b"
Apr 20 20:05:28.804619 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.804537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:28.804810 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.804711 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:28.804810 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.804807 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.804771675 +0000 UTC m=+4.159374399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:28.903626 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.903515 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf7a90d_c40a_4f5d_acec_d016728caae2.slice/crio-442c713118985c09ea91caa756357ea003d20773c3d3246ab0b622390d6db93c WatchSource:0}: Error finding container 442c713118985c09ea91caa756357ea003d20773c3d3246ab0b622390d6db93c: Status 404 returned error can't find the container with id 442c713118985c09ea91caa756357ea003d20773c3d3246ab0b622390d6db93c
Apr 20 20:05:28.904867 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:28.904840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:28.905009 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.904993 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:05:28.905062 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.905012 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:05:28.905062 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.905022 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:28.905137 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:28.905073 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.905055937 +0000 UTC m=+4.259658662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:28.905826 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.905781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d96f75f_8097_4ea6_bf1c_c5bc31761969.slice/crio-00e24b1034d5b09622e5c199a3b86d857196330c2eb035cbab44137fac74ee4a WatchSource:0}: Error finding container 00e24b1034d5b09622e5c199a3b86d857196330c2eb035cbab44137fac74ee4a: Status 404 returned error can't find the container with id 00e24b1034d5b09622e5c199a3b86d857196330c2eb035cbab44137fac74ee4a
Apr 20 20:05:28.908138 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.908120 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc97d8be_f2be_4f95_b5f8_2f30bd207040.slice/crio-644faed72f1f3885cde62570b0dff2ac28f67c47730cadb37b3fc1d881946bae WatchSource:0}: Error finding container 644faed72f1f3885cde62570b0dff2ac28f67c47730cadb37b3fc1d881946bae: Status 404 returned error can't find the container with id 644faed72f1f3885cde62570b0dff2ac28f67c47730cadb37b3fc1d881946bae
Apr 20 20:05:28.908874 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.908846 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fedadd_686b_4e93_a2ae_cbab6edc3bff.slice/crio-81ccfb3354709dde766a5e835a1783b6e5d19af227bddd40af74ed0a55cc5623 WatchSource:0}: Error finding container 81ccfb3354709dde766a5e835a1783b6e5d19af227bddd40af74ed0a55cc5623: Status 404 returned error can't find the container with id 81ccfb3354709dde766a5e835a1783b6e5d19af227bddd40af74ed0a55cc5623
Apr 20 20:05:28.910233 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.909757 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825cf132_8688_4d28_b93d_28380334363a.slice/crio-df0965e628d7fc2e220ac27d8965443738415f2819a5d8f199e2fe56aee01a2f WatchSource:0}: Error finding container df0965e628d7fc2e220ac27d8965443738415f2819a5d8f199e2fe56aee01a2f: Status 404 returned error can't find the container with id df0965e628d7fc2e220ac27d8965443738415f2819a5d8f199e2fe56aee01a2f
Apr 20 20:05:28.910890 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.910837 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77159cf3_9f8b_4a2f_8ff1_7210fcaf5e87.slice/crio-823f83c1fa10904cf370a182c994a44987c45b0bb64ceff04df7f3cc057f2ec5 WatchSource:0}: Error finding container 823f83c1fa10904cf370a182c994a44987c45b0bb64ceff04df7f3cc057f2ec5: Status 404 returned error can't find the container with id 823f83c1fa10904cf370a182c994a44987c45b0bb64ceff04df7f3cc057f2ec5
Apr 20 20:05:28.913868 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.913747 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cdb632c_969a_4d5b_96a5_44863d4b1606.slice/crio-368d2eced799fb02eb613e40dfd835761a40b6fbf31a280c41d69ce520870b58 WatchSource:0}: Error finding container 368d2eced799fb02eb613e40dfd835761a40b6fbf31a280c41d69ce520870b58: Status 404 returned error can't find the container with id 368d2eced799fb02eb613e40dfd835761a40b6fbf31a280c41d69ce520870b58
Apr 20 20:05:28.914606 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:05:28.914585 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda163cdce_f73c_4e59_8bba_e14ea4b5515d.slice/crio-c7cd1a321525862d420c247f38e43a9a5ac149192da1ee3cd6be6da33fe4f8c8 WatchSource:0}: Error finding container c7cd1a321525862d420c247f38e43a9a5ac149192da1ee3cd6be6da33fe4f8c8: Status 404 returned error can't find the container with id c7cd1a321525862d420c247f38e43a9a5ac149192da1ee3cd6be6da33fe4f8c8
Apr 20 20:05:29.153192 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.152954 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:27 +0000 UTC" deadline="2027-12-12 00:25:17.648803745 +0000 UTC"
Apr 20 20:05:29.153192 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.153128 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14404h19m48.495679402s"
Apr 20 20:05:29.223872 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.223813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" event={"ID":"fb94a1a385423d979d451957f03175cb","Type":"ContainerStarted","Data":"8defd0941767a74df7c7891ba7ea58fceb66ec4729f3ca7b20d4a164494697c6"}
Apr 20 20:05:29.224702 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.224683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" event={"ID":"a163cdce-f73c-4e59-8bba-e14ea4b5515d","Type":"ContainerStarted","Data":"c7cd1a321525862d420c247f38e43a9a5ac149192da1ee3cd6be6da33fe4f8c8"}
Apr 20 20:05:29.225659 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.225640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-b2hzv" event={"ID":"2cdb632c-969a-4d5b-96a5-44863d4b1606","Type":"ContainerStarted","Data":"368d2eced799fb02eb613e40dfd835761a40b6fbf31a280c41d69ce520870b58"}
Apr 20 20:05:29.226601 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.226573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z2mjf" event={"ID":"36ded35a-5271-4357-8314-e71ef065d1c8","Type":"ContainerStarted","Data":"baf7e8f0b190117ada83410e5e8a1938a6ec709479e5f86b14a9497bdb1a77f8"}
Apr 20 20:05:29.228735 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.228708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ksn8" event={"ID":"13fedadd-686b-4e93-a2ae-cbab6edc3bff","Type":"ContainerStarted","Data":"81ccfb3354709dde766a5e835a1783b6e5d19af227bddd40af74ed0a55cc5623"}
Apr 20 20:05:29.230568 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.230538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vhc9b" event={"ID":"dc97d8be-f2be-4f95-b5f8-2f30bd207040","Type":"ContainerStarted","Data":"644faed72f1f3885cde62570b0dff2ac28f67c47730cadb37b3fc1d881946bae"}
Apr 20 20:05:29.231627 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.231609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"00e24b1034d5b09622e5c199a3b86d857196330c2eb035cbab44137fac74ee4a"}
Apr 20 20:05:29.232484 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.232464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerStarted","Data":"823f83c1fa10904cf370a182c994a44987c45b0bb64ceff04df7f3cc057f2ec5"}
Apr 20 20:05:29.234176 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.234152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmx7s" event={"ID":"825cf132-8688-4d28-b93d-28380334363a","Type":"ContainerStarted","Data":"df0965e628d7fc2e220ac27d8965443738415f2819a5d8f199e2fe56aee01a2f"}
Apr 20 20:05:29.238629 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.236546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-98bc5" event={"ID":"daf7a90d-c40a-4f5d-acec-d016728caae2","Type":"ContainerStarted","Data":"442c713118985c09ea91caa756357ea003d20773c3d3246ab0b622390d6db93c"}
Apr 20 20:05:29.238629 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.237283 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-155.ec2.internal" podStartSLOduration=2.237269285 podStartE2EDuration="2.237269285s" podCreationTimestamp="2026-04-20 20:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:29.236383697 +0000 UTC m=+3.590986445" watchObservedRunningTime="2026-04-20 20:05:29.237269285 +0000 UTC m=+3.591872034"
Apr 20 20:05:29.812046 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.812014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:29.812196 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.812168 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:29.812253 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.812229 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:31.812211998 +0000 UTC m=+6.166814726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:29.912671 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:29.912591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:29.912851 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.912748 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:05:29.912851 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.912766 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:05:29.912851 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.912779 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:29.913034 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:29.912853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:31.912833745 +0000 UTC m=+6.267436483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:05:30.220677 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:30.219955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:30.220677 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:30.220092 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:30.220677 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:30.220494 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:30.220677 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:30.220582 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:30.260914 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:30.260816 2575 generic.go:358] "Generic (PLEG): container finished" podID="6b989593a346d1460ce3008799f9024b" containerID="2df496a3b315a21378903b8879454d72642030ef67acad6ee77a18368ffd25c9" exitCode=0 Apr 20 20:05:30.260914 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:30.260856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" event={"ID":"6b989593a346d1460ce3008799f9024b","Type":"ContainerDied","Data":"2df496a3b315a21378903b8879454d72642030ef67acad6ee77a18368ffd25c9"} Apr 20 20:05:31.297823 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:31.297485 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" event={"ID":"6b989593a346d1460ce3008799f9024b","Type":"ContainerStarted","Data":"1b74fd189e98e3ce177f4bcfa4033fffbd4238c77b479ded4e74c930954c79e7"} Apr 20 20:05:31.833289 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:31.833249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " 
pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:31.833457 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.833415 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:31.833515 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.833485 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.833465523 +0000 UTC m=+10.188068247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:31.934192 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:31.934144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:31.934332 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.934292 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:31.934332 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.934313 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:31.934332 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.934322 
2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:31.934450 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:31.934367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.934352894 +0000 UTC m=+10.288955619 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:32.218006 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:32.217927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:32.218168 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:32.218103 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:32.218168 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:32.218124 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:32.218279 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:32.218245 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:34.216636 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:34.216596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:34.217017 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:34.216727 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:34.217193 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:34.217174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:34.217309 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:34.217289 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:35.868293 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:35.867699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:35.868293 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.867881 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.868293 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.867950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:43.867929213 +0000 UTC m=+18.222531944 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.968333 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:35.968290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:35.968496 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.968448 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:35.968496 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.968464 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:35.968496 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.968477 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:35.968598 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:35.968524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:43.968510614 +0000 UTC m=+18.323113338 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:36.217742 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:36.217294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:36.217742 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:36.217318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:36.217742 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:36.217407 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:36.217742 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:36.217463 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:38.216572 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:38.216535 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:38.217027 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:38.216656 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:38.217094 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:38.216535 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:38.217192 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:38.217162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:40.217245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:40.217203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:40.217245 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:40.217254 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:40.217756 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:40.217351 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:40.217756 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:40.217518 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:42.217324 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:42.217292 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:42.217324 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:42.217324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:42.217906 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:42.217429 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:42.217906 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:42.217545 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:43.924935 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:43.924886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:43.925328 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:43.925036 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:43.925328 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:43.925099 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.925079979 +0000 UTC m=+34.279682719 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:44.025512 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:44.025474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:44.025691 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.025634 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:44.025691 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.025659 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:44.025691 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.025672 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:44.025874 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.025739 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:00.025720001 +0000 UTC m=+34.380322726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:44.220140 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:44.220061 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:44.220393 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:44.220067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:44.220393 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.220186 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:44.220393 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:44.220266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:45.743196 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.743145 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-155.ec2.internal" podStartSLOduration=18.743124018 podStartE2EDuration="18.743124018s" podCreationTimestamp="2026-04-20 20:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:31.313321729 +0000 UTC m=+5.667924477" watchObservedRunningTime="2026-04-20 20:05:45.743124018 +0000 UTC m=+20.097726765" Apr 20 20:05:45.743944 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.743925 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9rdlg"] Apr 20 20:05:45.812685 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.812653 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:45.812862 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:45.812737 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0" Apr 20 20:05:45.938930 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.938903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:45.938930 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.938935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-kubelet-config\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:45.939108 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:45.938959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-dbus\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:46.040033 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.039969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:46.040033 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.040004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-kubelet-config\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:46.040033 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.040028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-dbus\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:46.040200 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.040110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-kubelet-config\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:46.040200 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.040133 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:46.040200 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.040157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44d1f99f-7189-4247-a28d-b55ea139aca0-dbus\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:46.040200 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.040196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret podName:44d1f99f-7189-4247-a28d-b55ea139aca0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:46.540176379 +0000 UTC m=+20.894779117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret") pod "global-pull-secret-syncer-9rdlg" (UID: "44d1f99f-7189-4247-a28d-b55ea139aca0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:46.217734 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.217702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:46.217903 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.217865 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:46.217972 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.217916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:46.218066 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.218041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:46.331732 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.331019 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-98bc5" event={"ID":"daf7a90d-c40a-4f5d-acec-d016728caae2","Type":"ContainerStarted","Data":"a88ffac087736514c864ee08ad34d47935955e8590a81505be1cdc276a19ba03"}
Apr 20 20:05:46.544039 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:46.543978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:46.544135 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.544097 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:46.544176 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:46.544147 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret podName:44d1f99f-7189-4247-a28d-b55ea139aca0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:47.544133822 +0000 UTC m=+21.898736546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret") pod "global-pull-secret-syncer-9rdlg" (UID: "44d1f99f-7189-4247-a28d-b55ea139aca0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:47.217079 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.216910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:47.217673 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:47.217141 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:47.352477 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.352428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" event={"ID":"a163cdce-f73c-4e59-8bba-e14ea4b5515d","Type":"ContainerStarted","Data":"c623b1d8b9648a86cb54b5fc16d5dad6d51a12e53626b426636d88a426c95c94"}
Apr 20 20:05:47.356759 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.356730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-b2hzv" event={"ID":"2cdb632c-969a-4d5b-96a5-44863d4b1606","Type":"ContainerStarted","Data":"f283de7f26d84b5a0c80f65ecdac4a5fce0ec72476bd241d992541221d58dcdc"}
Apr 20 20:05:47.357981 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.357952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z2mjf" event={"ID":"36ded35a-5271-4357-8314-e71ef065d1c8","Type":"ContainerStarted","Data":"76bb93a10a559fb721497b886f0ece21a049b04ec8d68adf98f3387a8e6472cc"}
Apr 20 20:05:47.359231 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.359210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vhc9b" event={"ID":"dc97d8be-f2be-4f95-b5f8-2f30bd207040","Type":"ContainerStarted","Data":"f8cd2d38b33c6f880b834964686ffc42f485b48007a56af119e1463cecb2f380"}
Apr 20 20:05:47.361433 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361418 2575 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log"
Apr 20 20:05:47.361690 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361672 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d96f75f-8097-4ea6-bf1c-c5bc31761969" containerID="2d2c0d73059e0f56d4aa2a02e2736b0b27886f781adf8328d9505281a5f988f3" exitCode=1
Apr 20 20:05:47.361751 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"f8f41b1e26ff34c0109353bf1eccb39837cb5b9c73f521cd95e1464519a7ee9b"}
Apr 20 20:05:47.361751 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"cabf22bd87777f6de14b287fc528d6bc7bbc805b92ff7e24802aaf94fc8ae1ad"}
Apr 20 20:05:47.361869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"843875ce0050863c8d90a03c2beb01a957b2a4d8eaef332b58f883e42b5ac172"}
Apr 20 20:05:47.361869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"2a0ee8d4c6bd712375ee75adaf822dabdf3952c662a1dc8d32935fcde7901247"}
Apr 20 20:05:47.361869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerDied","Data":"2d2c0d73059e0f56d4aa2a02e2736b0b27886f781adf8328d9505281a5f988f3"}
Apr 20 20:05:47.361869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.361831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"5438f37f821bc2abf33f9bb6a02e3a40e923bd88854a862c093e618d87c6e573"}
Apr 20 20:05:47.362897 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.362878 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="106696b7f3bdb15a825608731e0734a0649c3cb045ca00444d382b28d7fa8ec9" exitCode=0
Apr 20 20:05:47.362979 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.362945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"106696b7f3bdb15a825608731e0734a0649c3cb045ca00444d382b28d7fa8ec9"}
Apr 20 20:05:47.364094 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.364071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmx7s" event={"ID":"825cf132-8688-4d28-b93d-28380334363a","Type":"ContainerStarted","Data":"c0b29e01f4da17e4f2689f14bbfbe8a10fcf27a821f91d7fa58398f0f0e964d6"}
Apr 20 20:05:47.372008 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.371971 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-b2hzv" podStartSLOduration=4.18613387 podStartE2EDuration="21.371961287s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.916143227 +0000 UTC m=+3.270745966" lastFinishedPulling="2026-04-20 20:05:46.101970658 +0000 UTC m=+20.456573383" observedRunningTime="2026-04-20 20:05:47.371955846 +0000 UTC m=+21.726558593" watchObservedRunningTime="2026-04-20 20:05:47.371961287 +0000 UTC m=+21.726564033"
Apr 20 20:05:47.372110 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.372045 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-98bc5" podStartSLOduration=4.176110884 podStartE2EDuration="21.37203961s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.905976027 +0000 UTC m=+3.260578766" lastFinishedPulling="2026-04-20 20:05:46.101904754 +0000 UTC m=+20.456507492" observedRunningTime="2026-04-20 20:05:46.348084848 +0000 UTC m=+20.702687596" watchObservedRunningTime="2026-04-20 20:05:47.37203961 +0000 UTC m=+21.726642358"
Apr 20 20:05:47.413033 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.412951 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pmx7s" podStartSLOduration=4.190008546 podStartE2EDuration="21.412937394s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.911763645 +0000 UTC m=+3.266366371" lastFinishedPulling="2026-04-20 20:05:46.134692481 +0000 UTC m=+20.489295219" observedRunningTime="2026-04-20 20:05:47.401357557 +0000 UTC m=+21.755960304" watchObservedRunningTime="2026-04-20 20:05:47.412937394 +0000 UTC m=+21.767540140"
Apr 20 20:05:47.413033 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.413025 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vhc9b" podStartSLOduration=4.190656526 podStartE2EDuration="21.413022595s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.91004137 +0000 UTC m=+3.264644099" lastFinishedPulling="2026-04-20 20:05:46.132407428 +0000 UTC m=+20.487010168" observedRunningTime="2026-04-20 20:05:47.4126003 +0000 UTC m=+21.767203046" watchObservedRunningTime="2026-04-20 20:05:47.413022595 +0000 UTC m=+21.767625341"
Apr 20 20:05:47.552243 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.552218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:47.552342 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:47.552330 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:47.552395 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:47.552386 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret podName:44d1f99f-7189-4247-a28d-b55ea139aca0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:49.552369827 +0000 UTC m=+23.906972551 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret") pod "global-pull-secret-syncer-9rdlg" (UID: "44d1f99f-7189-4247-a28d-b55ea139aca0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:47.681313 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:47.681285 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:05:48.156429 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.156307 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:05:47.681304363Z","UUID":"6cbb61c8-0f59-43e9-bb21-e8d02e237c49","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:05:48.158702 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.158675 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:05:48.158702 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.158709 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:05:48.218990 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.218946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:48.219424 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.218952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:48.219424 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:48.219048 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:48.219424 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:48.219143 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:48.368461 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.368422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ksn8" event={"ID":"13fedadd-686b-4e93-a2ae-cbab6edc3bff","Type":"ContainerStarted","Data":"98794819160fc9fe412d69abec3a3aeba64c17258dd56c99368c06a1c6e9eef7"}
Apr 20 20:05:48.370472 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.370441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" event={"ID":"a163cdce-f73c-4e59-8bba-e14ea4b5515d","Type":"ContainerStarted","Data":"7107ac2fbcbf0c955870a323835548c538d9c457c996a0537d161671563cd44e"}
Apr 20 20:05:48.380733 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:48.380682 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z2mjf" podStartSLOduration=5.124996148 podStartE2EDuration="22.380666686s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.914017362 +0000 UTC m=+3.268620099" lastFinishedPulling="2026-04-20 20:05:46.169687902 +0000 UTC m=+20.524290637" observedRunningTime="2026-04-20 20:05:47.431626722 +0000 UTC m=+21.786229468" watchObservedRunningTime="2026-04-20 20:05:48.380666686 +0000 UTC m=+22.735269435"
Apr 20 20:05:49.217417 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.217369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:49.217579 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:49.217515 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:49.375170 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.375137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log"
Apr 20 20:05:49.375617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.375556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"22e9c30063ffb91453f29172f609b4de9587129ea6a4e67fc080d8a3a790c1ba"}
Apr 20 20:05:49.377515 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.377495 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" event={"ID":"a163cdce-f73c-4e59-8bba-e14ea4b5515d","Type":"ContainerStarted","Data":"b79bd3d5d9d6de4eb69d07b8170b03cbcae2f9d8aec7ecf4a73473327cf7c14b"}
Apr 20 20:05:49.395400 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.395354 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5ksn8" podStartSLOduration=6.173821611 podStartE2EDuration="23.39534367s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.910890342 +0000 UTC m=+3.265493067" lastFinishedPulling="2026-04-20 20:05:46.132412387 +0000 UTC m=+20.487015126" observedRunningTime="2026-04-20 20:05:48.381555664 +0000 UTC m=+22.736158410" watchObservedRunningTime="2026-04-20 20:05:49.39534367 +0000 UTC m=+23.749946416"
Apr 20 20:05:49.395641 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.395613 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxbqm" podStartSLOduration=3.586340834 podStartE2EDuration="23.395605943s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.9170469 +0000 UTC m=+3.271649624" lastFinishedPulling="2026-04-20 20:05:48.726312007 +0000 UTC m=+23.080914733" observedRunningTime="2026-04-20 20:05:49.395218056 +0000 UTC m=+23.749820802" watchObservedRunningTime="2026-04-20 20:05:49.395605943 +0000 UTC m=+23.750208686"
Apr 20 20:05:49.566976 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:49.566942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:49.567149 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:49.567059 2575 secret.go:189] Couldn't get
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:49.567149 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:49.567118 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret podName:44d1f99f-7189-4247-a28d-b55ea139aca0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:53.567104308 +0000 UTC m=+27.921707045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret") pod "global-pull-secret-syncer-9rdlg" (UID: "44d1f99f-7189-4247-a28d-b55ea139aca0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:50.217612 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:50.217381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:50.217777 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:50.217381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:50.217777 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:50.217680 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:50.217777 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:50.217748 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:50.735013 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:50.734976 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:50.735626 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:50.735599 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:51.217026 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:51.216996 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:51.217176 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:51.217126 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:51.381674 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:51.381640 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:51.382232 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:51.382211 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-b2hzv"
Apr 20 20:05:52.216771 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:52.216736 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:52.217250 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:52.216830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:52.217250 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:52.216957 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:52.217250 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:52.217075 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:52.388548 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:52.388298 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log"
Apr 20 20:05:52.389176 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:52.388897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"758a24d0eeacac2322c9dc5dac055581d8f17c2a609bc5bcc1f4a2d65fc86d6a"}
Apr 20 20:05:52.389320 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:52.389305 2575 scope.go:117] "RemoveContainer" containerID="2d2c0d73059e0f56d4aa2a02e2736b0b27886f781adf8328d9505281a5f988f3"
Apr 20 20:05:53.217470 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.217300 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:53.218191 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:53.217566 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:53.393762 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.393733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log"
Apr 20 20:05:53.394041 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.394016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" event={"ID":"3d96f75f-8097-4ea6-bf1c-c5bc31761969","Type":"ContainerStarted","Data":"d34ed32bfe83f3fb5f8975b4fe97acc0f7aa3e672c33b4376ee525cc94b752d9"}
Apr 20 20:05:53.394158 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.394143 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:05:53.394434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.394411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:53.394534 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.394445 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:53.395727 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.395706 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="bb8821a690b60e42582e67ff863752b68dd3a034569d0793b797413dbdb92c81" exitCode=0
Apr 20 20:05:53.395859 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.395772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"bb8821a690b60e42582e67ff863752b68dd3a034569d0793b797413dbdb92c81"}
Apr 20 20:05:53.410046 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.410021 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:53.410178 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.410097 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984"
Apr 20 20:05:53.427832 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.427764 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" podStartSLOduration=10.146525358 podStartE2EDuration="27.427718099s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.907526605 +0000 UTC m=+3.262129329" lastFinishedPulling="2026-04-20 20:05:46.188719342 +0000 UTC m=+20.543322070" observedRunningTime="2026-04-20 20:05:53.427252875 +0000 UTC m=+27.781855645" watchObservedRunningTime="2026-04-20 20:05:53.427718099 +0000 UTC m=+27.782320845"
Apr 20 20:05:53.596905 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:53.596851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:53.597089 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:53.597028 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:53.597163 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:53.597140 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret podName:44d1f99f-7189-4247-a28d-b55ea139aca0 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.597118828 +0000 UTC m=+35.951721567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret") pod "global-pull-secret-syncer-9rdlg" (UID: "44d1f99f-7189-4247-a28d-b55ea139aca0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:54.216919 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.216895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:54.217043 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.216895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:54.217043 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:54.217019 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:54.217149 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:54.217129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:54.277995 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.277972 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9rdlg"]
Apr 20 20:05:54.278268 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.278087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:54.278268 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:54.278170 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:54.281114 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.281092 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tkqg4"]
Apr 20 20:05:54.294786 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.294757 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zwtx2"]
Apr 20 20:05:54.401677 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.401638 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="7ee08a1b7d7e01a46d44d08c20c61ad6f3a3189bd62276a0720b430ba9838b03" exitCode=0
Apr 20 20:05:54.401853 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.401679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"7ee08a1b7d7e01a46d44d08c20c61ad6f3a3189bd62276a0720b430ba9838b03"}
Apr 20 20:05:54.401853 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.401822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:54.401985 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.401913 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:05:54.401985 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:54.401924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4"
Apr 20 20:05:54.401985 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:54.401924 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8"
Apr 20 20:05:54.402936 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:54.402388 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e"
Apr 20 20:05:55.405497 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:55.405471 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="0a7ce67196f8118c0d1becda0f5457a9a34180b36891188f34ce29cfddea9bc4" exitCode=0
Apr 20 20:05:55.405884 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:55.405556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"0a7ce67196f8118c0d1becda0f5457a9a34180b36891188f34ce29cfddea9bc4"}
Apr 20 20:05:55.405884 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:55.405621 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:05:56.217581 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:56.217407 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg"
Apr 20 20:05:56.217754 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:56.217471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2"
Apr 20 20:05:56.217754 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:56.217673 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0"
Apr 20 20:05:56.217754 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:56.217488 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:56.217914 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:56.217723 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:56.217914 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:56.217824 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:57.657053 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:57.657023 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:57.657512 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:57.657232 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 20:05:57.669415 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:57.669390 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5k984" Apr 20 20:05:58.216504 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:58.216467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:58.216688 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:58.216468 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:05:58.216688 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:58.216596 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:05:58.216688 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:58.216654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:05:58.216899 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:58.216766 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9rdlg" podUID="44d1f99f-7189-4247-a28d-b55ea139aca0" Apr 20 20:05:58.216899 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:58.216871 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zwtx2" podUID="382abf9b-eb7a-4ef7-918c-50641d3558a8" Apr 20 20:05:58.977284 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:58.977251 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-155.ec2.internal" event="NodeReady" Apr 20 20:05:58.977818 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:58.977413 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:05:59.029172 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.029136 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mz59j"] Apr 20 20:05:59.049940 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.049912 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vw9v8"] Apr 20 20:05:59.050103 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.050083 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.052600 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.052570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\"" Apr 20 20:05:59.053216 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.053191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:05:59.053331 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.053222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:05:59.063841 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.063770 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mz59j"] Apr 20 20:05:59.063841 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.063812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-vw9v8"] Apr 20 20:05:59.063984 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.063888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.066638 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.066617 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:05:59.066638 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.066635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:05:59.066870 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.066702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\"" Apr 20 20:05:59.066870 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.066711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:05:59.141869 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.141832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fphbk\" (UniqueName: \"kubernetes.io/projected/857f5f33-f93c-4621-bf95-3178c8f078b1-kube-api-access-fphbk\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.142047 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.141874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86fdf4f4-4135-45ec-8ea1-ade53c77d176-config-volume\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 
20:05:59.142047 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.141905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86fdf4f4-4135-45ec-8ea1-ade53c77d176-tmp-dir\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.142047 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.141959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.142047 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.142017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjd4w\" (UniqueName: \"kubernetes.io/projected/86fdf4f4-4135-45ec-8ea1-ade53c77d176-kube-api-access-rjd4w\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.142047 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.142033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.243088 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 
20 20:05:59.243253 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjd4w\" (UniqueName: \"kubernetes.io/projected/86fdf4f4-4135-45ec-8ea1-ade53c77d176-kube-api-access-rjd4w\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.243253 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.243165 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:59.243253 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.243253 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.243231 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.743209902 +0000 UTC m=+34.097812641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:05:59.243253 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.243247 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:59.243516 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fphbk\" (UniqueName: \"kubernetes.io/projected/857f5f33-f93c-4621-bf95-3178c8f078b1-kube-api-access-fphbk\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.243516 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.243299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.743287878 +0000 UTC m=+34.097890623 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:05:59.243516 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86fdf4f4-4135-45ec-8ea1-ade53c77d176-config-volume\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.243516 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86fdf4f4-4135-45ec-8ea1-ade53c77d176-tmp-dir\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.243895 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86fdf4f4-4135-45ec-8ea1-ade53c77d176-tmp-dir\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.243991 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.243970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86fdf4f4-4135-45ec-8ea1-ade53c77d176-config-volume\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.254236 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.254213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjd4w\" (UniqueName: 
\"kubernetes.io/projected/86fdf4f4-4135-45ec-8ea1-ade53c77d176-kube-api-access-rjd4w\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.254386 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.254319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fphbk\" (UniqueName: \"kubernetes.io/projected/857f5f33-f93c-4621-bf95-3178c8f078b1-kube-api-access-fphbk\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.747583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.747541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:05:59.747767 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.747608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:05:59.747767 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.747698 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:59.747767 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.747701 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:59.747767 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.747751 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert 
podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:00.747738107 +0000 UTC m=+35.102340831 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:05:59.747767 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.747764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:00.747758346 +0000 UTC m=+35.102361070 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:05:59.948207 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:05:59.948160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:05:59.948381 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.948312 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.948381 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:05:59.948380 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:31.948364463 +0000 UTC m=+66.302967186 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:00.049162 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.049063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:00.049835 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.049218 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:00.049835 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.049239 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:00.049835 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.049249 2575 projected.go:194] Error preparing data for projected volume kube-api-access-85fws for pod openshift-network-diagnostics/network-check-target-zwtx2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:00.049835 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.049302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws 
podName:382abf9b-eb7a-4ef7-918c-50641d3558a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:32.049287582 +0000 UTC m=+66.403890321 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-85fws" (UniqueName: "kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws") pod "network-check-target-zwtx2" (UID: "382abf9b-eb7a-4ef7-918c-50641d3558a8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:00.217990 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.217952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:00.217990 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.217981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:06:00.218225 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.217955 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:06:00.220747 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.220723 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:00.220879 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.220775 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:00.221871 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.221593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:06:00.221871 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.221628 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:00.221871 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.221668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:06:00.221871 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.221705 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\"" Apr 20 20:06:00.755752 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.755719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:06:00.755946 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:00.755821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:06:00.755946 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.755857 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:00.755946 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.755905 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:00.755946 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.755923 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.755904676 +0000 UTC m=+37.110507417 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:06:00.755946 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:00.755940 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.755930045 +0000 UTC m=+37.110532774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:06:01.663342 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:01.663287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:06:01.666310 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:01.666275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44d1f99f-7189-4247-a28d-b55ea139aca0-original-pull-secret\") pod \"global-pull-secret-syncer-9rdlg\" (UID: \"44d1f99f-7189-4247-a28d-b55ea139aca0\") " pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:06:01.736501 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:01.736462 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rdlg" Apr 20 20:06:02.322229 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:02.322069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9rdlg"] Apr 20 20:06:02.358418 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:06:02.358380 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d1f99f_7189_4247_a28d_b55ea139aca0.slice/crio-9977e74e382e22f0f1c5bc1204f1bcf55db7b1159fe3659580c791c78025e59d WatchSource:0}: Error finding container 9977e74e382e22f0f1c5bc1204f1bcf55db7b1159fe3659580c791c78025e59d: Status 404 returned error can't find the container with id 9977e74e382e22f0f1c5bc1204f1bcf55db7b1159fe3659580c791c78025e59d Apr 20 20:06:02.420742 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:02.420619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9rdlg" event={"ID":"44d1f99f-7189-4247-a28d-b55ea139aca0","Type":"ContainerStarted","Data":"9977e74e382e22f0f1c5bc1204f1bcf55db7b1159fe3659580c791c78025e59d"} Apr 20 20:06:02.771221 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:02.771132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:06:02.771608 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:02.771240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:06:02.771608 ip-10-0-140-155 kubenswrapper[2575]: E0420 
20:06:02.771286 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:02.771608 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:02.771350 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:02.771608 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:02.771352 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.771334869 +0000 UTC m=+41.125937598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:06:02.771608 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:02.771413 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.771397183 +0000 UTC m=+41.125999909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:06:03.425877 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:03.425844 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="9cfbab7a3ba0dccc0902d2f5a5bcb9619c82a9653b99614580ee7f959f644ddc" exitCode=0 Apr 20 20:06:03.426040 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:03.425921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"9cfbab7a3ba0dccc0902d2f5a5bcb9619c82a9653b99614580ee7f959f644ddc"} Apr 20 20:06:04.430650 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:04.430618 2575 generic.go:358] "Generic (PLEG): container finished" podID="77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87" containerID="38b990837deae2babd615d32a5a4d327f53c787d0b336f193db1f78d6e0c7b16" exitCode=0 Apr 20 20:06:04.431126 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:04.430671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerDied","Data":"38b990837deae2babd615d32a5a4d327f53c787d0b336f193db1f78d6e0c7b16"} Apr 20 20:06:05.436193 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:05.435995 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zln2l" event={"ID":"77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87","Type":"ContainerStarted","Data":"7dcaf67b79d8ffaa5674695470e6b253bec4c2e01a16ba55c4f2bbda577138db"} Apr 20 20:06:05.462273 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:05.462209 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-zln2l" podStartSLOduration=5.990094462 podStartE2EDuration="39.462188219s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.912852482 +0000 UTC m=+3.267455221" lastFinishedPulling="2026-04-20 20:06:02.38494624 +0000 UTC m=+36.739548978" observedRunningTime="2026-04-20 20:06:05.460588074 +0000 UTC m=+39.815190845" watchObservedRunningTime="2026-04-20 20:06:05.462188219 +0000 UTC m=+39.816790967" Apr 20 20:06:06.806280 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:06.806185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:06:06.806280 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:06.806246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:06:06.806653 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:06.806331 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:06.806653 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:06.806395 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:14.806376689 +0000 UTC m=+49.160979413 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:06:06.806653 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:06.806333 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:06.806653 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:06.806473 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:14.806458881 +0000 UTC m=+49.161061628 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:06:07.099759 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.099729 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4"] Apr 20 20:06:07.125607 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.125580 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt"] Apr 20 20:06:07.125761 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.125722 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.129023 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.128933 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 20:06:07.129023 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.128985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.129023 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.128991 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.129491 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.129052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 20:06:07.129491 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.128993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-kgxw5\"" Apr 20 20:06:07.137578 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.137558 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4"] Apr 20 20:06:07.137578 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.137579 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt"] Apr 20 20:06:07.137701 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.137659 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.139571 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.139556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 20:06:07.139866 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.139848 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 20:06:07.139944 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.139911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 20:06:07.139944 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.139934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 20:06:07.309815 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76lv\" (UniqueName: \"kubernetes.io/projected/267cbc4f-38d3-482c-8cd4-ecdd11902c20-kube-api-access-q76lv\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dscsk\" (UniqueName: \"kubernetes.io/projected/64e243ca-f853-4687-a4d9-07d517f6cda5-kube-api-access-dscsk\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64e243ca-f853-4687-a4d9-07d517f6cda5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.309987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.310004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.310006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410670 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410670 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410670 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410642 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q76lv\" (UniqueName: \"kubernetes.io/projected/267cbc4f-38d3-482c-8cd4-ecdd11902c20-kube-api-access-q76lv\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410670 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dscsk\" (UniqueName: \"kubernetes.io/projected/64e243ca-f853-4687-a4d9-07d517f6cda5-kube-api-access-dscsk\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.410999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64e243ca-f853-4687-a4d9-07d517f6cda5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.410999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ca\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.410999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.410839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.411503 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.411464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.415228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.415196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-ca\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.415228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.415202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.415408 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.415281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64e243ca-f853-4687-a4d9-07d517f6cda5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.415408 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.415327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.415408 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.415327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/267cbc4f-38d3-482c-8cd4-ecdd11902c20-hub\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.419002 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.418981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscsk\" (UniqueName: \"kubernetes.io/projected/64e243ca-f853-4687-a4d9-07d517f6cda5-kube-api-access-dscsk\") pod \"managed-serviceaccount-addon-agent-5f98685b8f-gl7d4\" (UID: \"64e243ca-f853-4687-a4d9-07d517f6cda5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.420842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.420819 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76lv\" (UniqueName: \"kubernetes.io/projected/267cbc4f-38d3-482c-8cd4-ecdd11902c20-kube-api-access-q76lv\") pod \"cluster-proxy-proxy-agent-f486444b4-gczbt\" (UID: \"267cbc4f-38d3-482c-8cd4-ecdd11902c20\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.441839 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.441786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9rdlg" event={"ID":"44d1f99f-7189-4247-a28d-b55ea139aca0","Type":"ContainerStarted","Data":"463f6ae50e153a11bfa07092af4b5a6321fe95882ab7d38c18654b22106b30be"} Apr 20 20:06:07.444169 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.444155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" Apr 20 20:06:07.454256 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.454227 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:06:07.456448 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.456402 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9rdlg" podStartSLOduration=18.409052498 podStartE2EDuration="22.456388927s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:06:02.363323595 +0000 UTC m=+36.717926320" lastFinishedPulling="2026-04-20 20:06:06.410660016 +0000 UTC m=+40.765262749" observedRunningTime="2026-04-20 20:06:07.45594333 +0000 UTC m=+41.810546077" watchObservedRunningTime="2026-04-20 20:06:07.456388927 +0000 UTC m=+41.810991673" Apr 20 20:06:07.593454 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.593426 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4"] Apr 20 20:06:07.596840 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:06:07.596767 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64e243ca_f853_4687_a4d9_07d517f6cda5.slice/crio-933b2e592ff3290933491dfed889494a5727a890f130b63999921749d77ce3df WatchSource:0}: Error finding container 933b2e592ff3290933491dfed889494a5727a890f130b63999921749d77ce3df: Status 404 returned error can't find the container with id 933b2e592ff3290933491dfed889494a5727a890f130b63999921749d77ce3df Apr 20 20:06:07.614175 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:07.614147 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt"] Apr 20 20:06:07.616658 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:06:07.616630 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267cbc4f_38d3_482c_8cd4_ecdd11902c20.slice/crio-de79d6c6afa4211d83857b0f9829b8938193e8d4a17b61001c1350c094f71b35 WatchSource:0}: Error finding container de79d6c6afa4211d83857b0f9829b8938193e8d4a17b61001c1350c094f71b35: Status 404 returned error can't find the container with id de79d6c6afa4211d83857b0f9829b8938193e8d4a17b61001c1350c094f71b35 Apr 20 20:06:08.444973 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:08.444934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerStarted","Data":"de79d6c6afa4211d83857b0f9829b8938193e8d4a17b61001c1350c094f71b35"} Apr 20 20:06:08.448254 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:08.448220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" event={"ID":"64e243ca-f853-4687-a4d9-07d517f6cda5","Type":"ContainerStarted","Data":"933b2e592ff3290933491dfed889494a5727a890f130b63999921749d77ce3df"} Apr 20 20:06:12.457343 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:12.457289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerStarted","Data":"0f3ab394f7450d8996c82dd6a4b0557b92450b5e49fc02d07409b2c0684c582a"} Apr 20 20:06:12.458809 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:12.458762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" event={"ID":"64e243ca-f853-4687-a4d9-07d517f6cda5","Type":"ContainerStarted","Data":"73b8a1b532f60c6d5a74530c0ad1b95fcc358f890f266cc086ed7690d7fa5ca7"} Apr 20 20:06:12.475580 ip-10-0-140-155 kubenswrapper[2575]: I0420 
20:06:12.475531 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" podStartSLOduration=1.593094281 podStartE2EDuration="5.475518407s" podCreationTimestamp="2026-04-20 20:06:07 +0000 UTC" firstStartedPulling="2026-04-20 20:06:07.598877283 +0000 UTC m=+41.953480008" lastFinishedPulling="2026-04-20 20:06:11.48130141 +0000 UTC m=+45.835904134" observedRunningTime="2026-04-20 20:06:12.475234633 +0000 UTC m=+46.829837380" watchObservedRunningTime="2026-04-20 20:06:12.475518407 +0000 UTC m=+46.830121131" Apr 20 20:06:14.464355 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:14.464324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerStarted","Data":"3ae6821467b0f580343d4ebaee7af31abbb221d3c35fe32dfcccce06eb974973"} Apr 20 20:06:14.871992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:14.871953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:06:14.872182 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:14.872017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:06:14.872182 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:14.872122 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:14.872182 
ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:14.872122 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:14.872182 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:14.872183 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:30.872169419 +0000 UTC m=+65.226772155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:06:14.872383 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:14.872198 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:30.872191483 +0000 UTC m=+65.226794218 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:06:15.467578 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:15.467543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerStarted","Data":"491ca9451e93304103ab83f2cc71623a4b43c0338f06d29c81c8784fa1d988d9"} Apr 20 20:06:15.485647 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:15.485598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" podStartSLOduration=1.790353848 podStartE2EDuration="8.485584861s" podCreationTimestamp="2026-04-20 20:06:07 +0000 UTC" firstStartedPulling="2026-04-20 20:06:07.618400307 +0000 UTC m=+41.973003031" lastFinishedPulling="2026-04-20 20:06:14.313631313 +0000 UTC m=+48.668234044" observedRunningTime="2026-04-20 20:06:15.485108191 +0000 UTC m=+49.839710936" watchObservedRunningTime="2026-04-20 20:06:15.485584861 +0000 UTC m=+49.840187607" Apr 20 20:06:30.880020 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:30.879966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:06:30.880445 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:30.880041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: 
\"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:06:30.880445 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:30.880114 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:30.880445 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:30.880151 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:30.880445 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:30.880195 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.880175615 +0000 UTC m=+97.234778358 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:06:30.880445 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:30.880210 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.880203353 +0000 UTC m=+97.234806077 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:06:31.986981 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:31.986948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:06:31.989929 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:31.989909 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:31.997637 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:31.997615 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:06:31.997716 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:06:31.997678 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:07:35.997656295 +0000 UTC m=+130.352259033 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : secret "metrics-daemon-secret" not found Apr 20 20:06:32.087614 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.087584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:32.090174 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.090158 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:32.100031 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.100016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:32.111243 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.111214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fws\" (UniqueName: \"kubernetes.io/projected/382abf9b-eb7a-4ef7-918c-50641d3558a8-kube-api-access-85fws\") pod \"network-check-target-zwtx2\" (UID: \"382abf9b-eb7a-4ef7-918c-50641d3558a8\") " pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:32.333365 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.333286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\"" Apr 20 20:06:32.341347 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.341334 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:32.457537 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.457510 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zwtx2"] Apr 20 20:06:32.460663 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:06:32.460631 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382abf9b_eb7a_4ef7_918c_50641d3558a8.slice/crio-9d9740d4bbbceb9bb23fdba4d303dc219136e03cce7ac7c9e28baa57d7e652e9 WatchSource:0}: Error finding container 9d9740d4bbbceb9bb23fdba4d303dc219136e03cce7ac7c9e28baa57d7e652e9: Status 404 returned error can't find the container with id 9d9740d4bbbceb9bb23fdba4d303dc219136e03cce7ac7c9e28baa57d7e652e9 Apr 20 20:06:32.500653 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:32.500624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zwtx2" event={"ID":"382abf9b-eb7a-4ef7-918c-50641d3558a8","Type":"ContainerStarted","Data":"9d9740d4bbbceb9bb23fdba4d303dc219136e03cce7ac7c9e28baa57d7e652e9"} Apr 20 20:06:35.509747 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:35.509705 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zwtx2" event={"ID":"382abf9b-eb7a-4ef7-918c-50641d3558a8","Type":"ContainerStarted","Data":"18aed95df8d88e4b5dae709800a37580365773d71f01fad2ef3c917d353e1dbb"} Apr 20 20:06:35.510156 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:35.509916 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:06:35.528512 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:06:35.528472 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zwtx2" 
podStartSLOduration=66.676963053 podStartE2EDuration="1m9.528459804s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:06:32.462274367 +0000 UTC m=+66.816877093" lastFinishedPulling="2026-04-20 20:06:35.313771106 +0000 UTC m=+69.668373844" observedRunningTime="2026-04-20 20:06:35.527862872 +0000 UTC m=+69.882465615" watchObservedRunningTime="2026-04-20 20:06:35.528459804 +0000 UTC m=+69.883062547" Apr 20 20:07:02.896052 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:02.896001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:07:02.896052 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:02.896060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:07:02.896592 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:02.896151 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:02.896592 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:02.896182 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:02.896592 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:02.896225 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls podName:86fdf4f4-4135-45ec-8ea1-ade53c77d176 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:08:06.896205723 +0000 UTC m=+161.250808447 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls") pod "dns-default-mz59j" (UID: "86fdf4f4-4135-45ec-8ea1-ade53c77d176") : secret "dns-default-metrics-tls" not found Apr 20 20:07:02.896592 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:02.896241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert podName:857f5f33-f93c-4621-bf95-3178c8f078b1 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:06.896234505 +0000 UTC m=+161.250837229 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert") pod "ingress-canary-vw9v8" (UID: "857f5f33-f93c-4621-bf95-3178c8f078b1") : secret "canary-serving-cert" not found Apr 20 20:07:06.514108 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:06.514078 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zwtx2" Apr 20 20:07:36.016238 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:36.016180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:07:36.016765 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:36.016333 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:36.016765 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:07:36.016422 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs podName:e5ae3fcb-9103-43ca-a45c-46f86f99016e nodeName:}" failed. No retries permitted until 2026-04-20 20:09:38.016402129 +0000 UTC m=+252.371004852 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs") pod "network-metrics-daemon-tkqg4" (UID: "e5ae3fcb-9103-43ca-a45c-46f86f99016e") : secret "metrics-daemon-secret" not found Apr 20 20:07:46.574137 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:46.574105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmx7s_825cf132-8688-4d28-b93d-28380334363a/dns-node-resolver/0.log" Apr 20 20:07:47.376150 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:07:47.376124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vhc9b_dc97d8be-f2be-4f95-b5f8-2f30bd207040/node-ca/0.log" Apr 20 20:08:02.061619 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:02.061580 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mz59j" podUID="86fdf4f4-4135-45ec-8ea1-ade53c77d176" Apr 20 20:08:02.072875 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:02.072855 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vw9v8" podUID="857f5f33-f93c-4621-bf95-3178c8f078b1" Apr 20 20:08:02.724053 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:02.724020 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:08:02.724236 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:02.724025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:03.242498 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:03.242461 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tkqg4" podUID="e5ae3fcb-9103-43ca-a45c-46f86f99016e" Apr 20 20:08:06.929276 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:06.929238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:06.929276 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:06.929282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:08:06.931659 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:06.931630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86fdf4f4-4135-45ec-8ea1-ade53c77d176-metrics-tls\") pod \"dns-default-mz59j\" (UID: \"86fdf4f4-4135-45ec-8ea1-ade53c77d176\") " pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:06.931812 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:06.931774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/857f5f33-f93c-4621-bf95-3178c8f078b1-cert\") pod \"ingress-canary-vw9v8\" (UID: \"857f5f33-f93c-4621-bf95-3178c8f078b1\") " pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:08:07.227300 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.227233 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\"" Apr 20 20:08:07.228087 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.228072 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\"" Apr 20 20:08:07.235072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.235052 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:07.235072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.235072 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vw9v8" Apr 20 20:08:07.363139 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.363112 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mz59j"] Apr 20 20:08:07.366832 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:07.366771 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86fdf4f4_4135_45ec_8ea1_ade53c77d176.slice/crio-cfc7294a2e3e52531add5e30bccb4a3d7a88ca007bb2ced4217a45c6a7476cf3 WatchSource:0}: Error finding container cfc7294a2e3e52531add5e30bccb4a3d7a88ca007bb2ced4217a45c6a7476cf3: Status 404 returned error can't find the container with id cfc7294a2e3e52531add5e30bccb4a3d7a88ca007bb2ced4217a45c6a7476cf3 Apr 20 20:08:07.377560 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.377536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vw9v8"] Apr 20 20:08:07.379745 ip-10-0-140-155 
kubenswrapper[2575]: W0420 20:08:07.379726 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod857f5f33_f93c_4621_bf95_3178c8f078b1.slice/crio-25406032bc73eaea538f3ca65e30c43254b5af9cc26b3564b231ba5c580b48dd WatchSource:0}: Error finding container 25406032bc73eaea538f3ca65e30c43254b5af9cc26b3564b231ba5c580b48dd: Status 404 returned error can't find the container with id 25406032bc73eaea538f3ca65e30c43254b5af9cc26b3564b231ba5c580b48dd Apr 20 20:08:07.739974 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.739939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vw9v8" event={"ID":"857f5f33-f93c-4621-bf95-3178c8f078b1","Type":"ContainerStarted","Data":"25406032bc73eaea538f3ca65e30c43254b5af9cc26b3564b231ba5c580b48dd"} Apr 20 20:08:07.740886 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:07.740863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mz59j" event={"ID":"86fdf4f4-4135-45ec-8ea1-ade53c77d176","Type":"ContainerStarted","Data":"cfc7294a2e3e52531add5e30bccb4a3d7a88ca007bb2ced4217a45c6a7476cf3"} Apr 20 20:08:09.746982 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.746941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vw9v8" event={"ID":"857f5f33-f93c-4621-bf95-3178c8f078b1","Type":"ContainerStarted","Data":"c67946c6b55cda15d092b6e376343931a64b516ba4a3e31fa983abf1c4336b0b"} Apr 20 20:08:09.748476 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.748449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mz59j" event={"ID":"86fdf4f4-4135-45ec-8ea1-ade53c77d176","Type":"ContainerStarted","Data":"5e0f5ea06710d756e3d08254d24b0f606db6595ff99b8e7549db002c97df44bd"} Apr 20 20:08:09.748584 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.748483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-mz59j" event={"ID":"86fdf4f4-4135-45ec-8ea1-ade53c77d176","Type":"ContainerStarted","Data":"48f5a3794b901cb438cf3dd34cd40848af4d774f28d14679fbabaf7a96d0d08c"} Apr 20 20:08:09.748622 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.748591 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:09.763258 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.763226 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vw9v8" podStartSLOduration=128.779133633 podStartE2EDuration="2m10.76321594s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:08:07.381382789 +0000 UTC m=+161.735985514" lastFinishedPulling="2026-04-20 20:08:09.365465083 +0000 UTC m=+163.720067821" observedRunningTime="2026-04-20 20:08:09.762775443 +0000 UTC m=+164.117378189" watchObservedRunningTime="2026-04-20 20:08:09.76321594 +0000 UTC m=+164.117818685" Apr 20 20:08:09.779906 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:09.779873 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mz59j" podStartSLOduration=129.365061418 podStartE2EDuration="2m10.779862247s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:08:07.368622584 +0000 UTC m=+161.723225321" lastFinishedPulling="2026-04-20 20:08:08.783423415 +0000 UTC m=+163.138026150" observedRunningTime="2026-04-20 20:08:09.779397159 +0000 UTC m=+164.133999905" watchObservedRunningTime="2026-04-20 20:08:09.779862247 +0000 UTC m=+164.134464994" Apr 20 20:08:11.754776 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:11.754747 2575 generic.go:358] "Generic (PLEG): container finished" podID="64e243ca-f853-4687-a4d9-07d517f6cda5" containerID="73b8a1b532f60c6d5a74530c0ad1b95fcc358f890f266cc086ed7690d7fa5ca7" exitCode=255 Apr 20 20:08:11.755093 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:08:11.754818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" event={"ID":"64e243ca-f853-4687-a4d9-07d517f6cda5","Type":"ContainerDied","Data":"73b8a1b532f60c6d5a74530c0ad1b95fcc358f890f266cc086ed7690d7fa5ca7"} Apr 20 20:08:11.755155 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:11.755098 2575 scope.go:117] "RemoveContainer" containerID="73b8a1b532f60c6d5a74530c0ad1b95fcc358f890f266cc086ed7690d7fa5ca7" Apr 20 20:08:12.759159 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:12.759118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f98685b8f-gl7d4" event={"ID":"64e243ca-f853-4687-a4d9-07d517f6cda5","Type":"ContainerStarted","Data":"1bbde78ff50e541684be7b1012cfc86aa68fe097ca82a17e0cdece2af0faa767"} Apr 20 20:08:17.217201 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:17.217167 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:08:19.754072 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:19.754047 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mz59j" Apr 20 20:08:21.069936 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.069903 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7n2qv"] Apr 20 20:08:21.073161 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.073142 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.075917 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.075882 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:08:21.075917 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.075893 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:08:21.076071 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.075936 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:08:21.076931 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.076915 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lxzp7\"" Apr 20 20:08:21.076992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.076928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:08:21.098970 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.098938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7n2qv"] Apr 20 20:08:21.124976 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.124956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f69caa4-e0fd-4758-a26d-5de4cae89837-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.125091 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.124990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-v8bzl\" (UniqueName: \"kubernetes.io/projected/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-api-access-v8bzl\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.125091 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.125019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f69caa4-e0fd-4758-a26d-5de4cae89837-data-volume\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.125210 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.125088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f69caa4-e0fd-4758-a26d-5de4cae89837-crio-socket\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.125210 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.125116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.225733 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " 
pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.225878 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f69caa4-e0fd-4758-a26d-5de4cae89837-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.225878 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8bzl\" (UniqueName: \"kubernetes.io/projected/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-api-access-v8bzl\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.225878 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f69caa4-e0fd-4758-a26d-5de4cae89837-data-volume\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.226026 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f69caa4-e0fd-4758-a26d-5de4cae89837-crio-socket\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.226026 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.225949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/4f69caa4-e0fd-4758-a26d-5de4cae89837-crio-socket\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.226259 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.226238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f69caa4-e0fd-4758-a26d-5de4cae89837-data-volume\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.226397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.226376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.228214 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.228197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f69caa4-e0fd-4758-a26d-5de4cae89837-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.244201 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.244183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8bzl\" (UniqueName: \"kubernetes.io/projected/4f69caa4-e0fd-4758-a26d-5de4cae89837-kube-api-access-v8bzl\") pod \"insights-runtime-extractor-7n2qv\" (UID: \"4f69caa4-e0fd-4758-a26d-5de4cae89837\") " pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.381351 ip-10-0-140-155 kubenswrapper[2575]: 
I0420 20:08:21.381324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7n2qv" Apr 20 20:08:21.511060 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.511030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7n2qv"] Apr 20 20:08:21.513956 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:21.513929 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f69caa4_e0fd_4758_a26d_5de4cae89837.slice/crio-fa3964a4edd972b13b318884b840f39e24f67d20a7293b761ec1a5361fd60b91 WatchSource:0}: Error finding container fa3964a4edd972b13b318884b840f39e24f67d20a7293b761ec1a5361fd60b91: Status 404 returned error can't find the container with id fa3964a4edd972b13b318884b840f39e24f67d20a7293b761ec1a5361fd60b91 Apr 20 20:08:21.783948 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.783866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n2qv" event={"ID":"4f69caa4-e0fd-4758-a26d-5de4cae89837","Type":"ContainerStarted","Data":"84f77d0cb637a265bbbd5e9eeaf217ea3d7c86d99cc784dee06f740528321aa5"} Apr 20 20:08:21.783948 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:21.783905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n2qv" event={"ID":"4f69caa4-e0fd-4758-a26d-5de4cae89837","Type":"ContainerStarted","Data":"fa3964a4edd972b13b318884b840f39e24f67d20a7293b761ec1a5361fd60b91"} Apr 20 20:08:22.788211 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:22.788121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n2qv" event={"ID":"4f69caa4-e0fd-4758-a26d-5de4cae89837","Type":"ContainerStarted","Data":"c3171d71ef98f6051006bdb0728c43927c4d0a2dfdae4176fddfac35d0ed6354"} Apr 20 20:08:24.257893 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:08:24.257857 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9npmw"] Apr 20 20:08:24.260832 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.260815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.264674 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.264645 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:08:24.264856 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.264696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 20:08:24.265102 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.265086 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:08:24.265607 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.265592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 20:08:24.265666 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.265592 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:08:24.265902 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.265880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nptq6\"" Apr 20 20:08:24.284876 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.284849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9npmw"] Apr 20 20:08:24.346196 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.346174 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.346302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.346215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95353743-5969-4ff5-8f9b-7bceb016d992-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.346302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.346243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.346372 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.346293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx62c\" (UniqueName: \"kubernetes.io/projected/95353743-5969-4ff5-8f9b-7bceb016d992-kube-api-access-tx62c\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.447041 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.447018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.447133 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.447058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95353743-5969-4ff5-8f9b-7bceb016d992-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.447133 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.447080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.447133 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.447098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx62c\" (UniqueName: \"kubernetes.io/projected/95353743-5969-4ff5-8f9b-7bceb016d992-kube-api-access-tx62c\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.447275 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:24.447205 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 20:08:24.447335 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:24.447281 2575 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls podName:95353743-5969-4ff5-8f9b-7bceb016d992 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:24.947258728 +0000 UTC m=+179.301861454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-9npmw" (UID: "95353743-5969-4ff5-8f9b-7bceb016d992") : secret "prometheus-operator-tls" not found Apr 20 20:08:24.447667 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.447647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95353743-5969-4ff5-8f9b-7bceb016d992-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.449520 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.449494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.456549 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.456529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx62c\" (UniqueName: \"kubernetes.io/projected/95353743-5969-4ff5-8f9b-7bceb016d992-kube-api-access-tx62c\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.797667 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:08:24.797632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n2qv" event={"ID":"4f69caa4-e0fd-4758-a26d-5de4cae89837","Type":"ContainerStarted","Data":"c2bc09120707df614394768d6af1b7d16e1e5e027d83a8c27f827858cb01403e"} Apr 20 20:08:24.816959 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.816916 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7n2qv" podStartSLOduration=1.557180352 podStartE2EDuration="3.816900394s" podCreationTimestamp="2026-04-20 20:08:21 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.567372362 +0000 UTC m=+175.921975085" lastFinishedPulling="2026-04-20 20:08:23.827092399 +0000 UTC m=+178.181695127" observedRunningTime="2026-04-20 20:08:24.816163309 +0000 UTC m=+179.170766055" watchObservedRunningTime="2026-04-20 20:08:24.816900394 +0000 UTC m=+179.171503139" Apr 20 20:08:24.949925 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.949895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:24.952397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:24.952369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/95353743-5969-4ff5-8f9b-7bceb016d992-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9npmw\" (UID: \"95353743-5969-4ff5-8f9b-7bceb016d992\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:25.169948 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:25.169923 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" Apr 20 20:08:25.288142 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:25.288114 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9npmw"] Apr 20 20:08:25.290854 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:25.290828 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95353743_5969_4ff5_8f9b_7bceb016d992.slice/crio-bb1ecb3440ea38ac0e76a2fe10888c144a8cd54b2f254c71ee8ccd3d5dd05ab9 WatchSource:0}: Error finding container bb1ecb3440ea38ac0e76a2fe10888c144a8cd54b2f254c71ee8ccd3d5dd05ab9: Status 404 returned error can't find the container with id bb1ecb3440ea38ac0e76a2fe10888c144a8cd54b2f254c71ee8ccd3d5dd05ab9 Apr 20 20:08:25.801591 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:25.801553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" event={"ID":"95353743-5969-4ff5-8f9b-7bceb016d992","Type":"ContainerStarted","Data":"bb1ecb3440ea38ac0e76a2fe10888c144a8cd54b2f254c71ee8ccd3d5dd05ab9"} Apr 20 20:08:26.806405 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:26.806366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" event={"ID":"95353743-5969-4ff5-8f9b-7bceb016d992","Type":"ContainerStarted","Data":"739b2248c36f4def09ceac98f183a5671b28cf3ae97186682c99f470a7263364"} Apr 20 20:08:26.806405 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:26.806412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" event={"ID":"95353743-5969-4ff5-8f9b-7bceb016d992","Type":"ContainerStarted","Data":"fb53ebbb1f734b5ba65ee73f083459bc9ebe3dc75aa0ff59198d9c2a5111f4b0"} Apr 20 20:08:26.826981 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:26.826931 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9npmw" podStartSLOduration=1.801363334 podStartE2EDuration="2.826915536s" podCreationTimestamp="2026-04-20 20:08:24 +0000 UTC" firstStartedPulling="2026-04-20 20:08:25.293160139 +0000 UTC m=+179.647762863" lastFinishedPulling="2026-04-20 20:08:26.31871234 +0000 UTC m=+180.673315065" observedRunningTime="2026-04-20 20:08:26.826305162 +0000 UTC m=+181.180907910" watchObservedRunningTime="2026-04-20 20:08:26.826915536 +0000 UTC m=+181.181518281" Apr 20 20:08:28.755102 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.755069 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-76tnk"] Apr 20 20:08:28.758201 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.758180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.760951 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.760930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:08:28.761083 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.760993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:08:28.761083 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.761048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7dbd8\"" Apr 20 20:08:28.761297 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.761280 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:08:28.777636 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777613 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5sxt\" (UniqueName: \"kubernetes.io/projected/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-kube-api-access-h5sxt\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777720 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-metrics-client-ca\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777720 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777720 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-root\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-wtmp\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" 
Apr 20 20:08:28.777851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-textfile\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.777972 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.777874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-sys\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878100 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-textfile\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878185 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-sys\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878185 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5sxt\" (UniqueName: \"kubernetes.io/projected/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-kube-api-access-h5sxt\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878253 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-sys\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878253 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-metrics-client-ca\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878329 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878329 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-root\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878329 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-wtmp\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878329 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-root\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:28.878394 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-textfile\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-wtmp\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878500 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:28.878469 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls podName:40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce nodeName:}" failed. No retries permitted until 2026-04-20 20:08:29.378448704 +0000 UTC m=+183.733051431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls") pod "node-exporter-76tnk" (UID: "40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce") : secret "node-exporter-tls" not found Apr 20 20:08:28.878816 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-metrics-client-ca\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.878858 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.878813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.880751 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.880735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:28.890980 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:28.890958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5sxt\" (UniqueName: \"kubernetes.io/projected/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-kube-api-access-h5sxt\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:29.380622 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:08:29.380588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:29.380784 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:29.380736 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:29.380858 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:29.380828 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls podName:40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce nodeName:}" failed. No retries permitted until 2026-04-20 20:08:30.380784637 +0000 UTC m=+184.735387361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls") pod "node-exporter-76tnk" (UID: "40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce") : secret "node-exporter-tls" not found Apr 20 20:08:30.389703 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:30.389674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:30.392112 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:30.392089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce-node-exporter-tls\") pod \"node-exporter-76tnk\" (UID: \"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce\") " 
pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:30.567348 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:30.567322 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-76tnk" Apr 20 20:08:30.575295 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:30.575269 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a7c052_d7ca_4d0f_ab9d_6715e8cfb1ce.slice/crio-d32b98c36f22e855a22702ab3839647b9a6972d92a3bb5a82993fabb4cc3d896 WatchSource:0}: Error finding container d32b98c36f22e855a22702ab3839647b9a6972d92a3bb5a82993fabb4cc3d896: Status 404 returned error can't find the container with id d32b98c36f22e855a22702ab3839647b9a6972d92a3bb5a82993fabb4cc3d896 Apr 20 20:08:30.818246 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:30.818186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-76tnk" event={"ID":"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce","Type":"ContainerStarted","Data":"d32b98c36f22e855a22702ab3839647b9a6972d92a3bb5a82993fabb4cc3d896"} Apr 20 20:08:31.822091 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:31.822059 2575 generic.go:358] "Generic (PLEG): container finished" podID="40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce" containerID="e7159ca250a7bff7554b5306019453ff9879d511c41d7e56af240fb9cc11ba38" exitCode=0 Apr 20 20:08:31.822460 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:31.822101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-76tnk" event={"ID":"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce","Type":"ContainerDied","Data":"e7159ca250a7bff7554b5306019453ff9879d511c41d7e56af240fb9cc11ba38"} Apr 20 20:08:32.830148 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:32.830107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-76tnk" 
event={"ID":"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce","Type":"ContainerStarted","Data":"eb4053584354bd4f9344c3c35c2972997af878088d3c86e8e70350e771d352fb"} Apr 20 20:08:32.830148 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:32.830149 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-76tnk" event={"ID":"40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce","Type":"ContainerStarted","Data":"394a19a5b89f30cadc6a2bba29e86b6cb48de4131f9c8193c7b061f2445772b6"} Apr 20 20:08:32.852839 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:32.852764 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-76tnk" podStartSLOduration=4.140133672 podStartE2EDuration="4.852748177s" podCreationTimestamp="2026-04-20 20:08:28 +0000 UTC" firstStartedPulling="2026-04-20 20:08:30.577040887 +0000 UTC m=+184.931643611" lastFinishedPulling="2026-04-20 20:08:31.289655379 +0000 UTC m=+185.644258116" observedRunningTime="2026-04-20 20:08:32.851053946 +0000 UTC m=+187.205656693" watchObservedRunningTime="2026-04-20 20:08:32.852748177 +0000 UTC m=+187.207350922" Apr 20 20:08:33.172558 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.172528 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8bc855446-2vt7r"] Apr 20 20:08:33.175726 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.175711 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" Apr 20 20:08:33.179567 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.179548 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 20:08:33.179682 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.179546 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 20:08:33.179682 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.179546 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 20:08:33.180290 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.180269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4isdv3nimtvup\"" Apr 20 20:08:33.180439 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.180424 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-8hrxv\"" Apr 20 20:08:33.180562 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.180535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 20:08:33.197826 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.197786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8bc855446-2vt7r"] Apr 20 20:08:33.209097 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-client-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " 
pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209185 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-metrics-server-audit-profiles\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209185 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz5hf\" (UniqueName: \"kubernetes.io/projected/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-kube-api-access-qz5hf\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209185 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-audit-log\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209295 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-client-certs\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209295 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.209366 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.209296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-tls\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309765 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-tls\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309863 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-client-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309863 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-metrics-server-audit-profiles\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309863 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz5hf\" (UniqueName: \"kubernetes.io/projected/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-kube-api-access-qz5hf\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309993 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-audit-log\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309993 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-client-certs\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.309993 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.309934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.310324 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.310306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-audit-log\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.310783 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.310762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.311081 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.311059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-metrics-server-audit-profiles\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.312373 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.312351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-client-certs\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.312522 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.312505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-secret-metrics-server-tls\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.312578 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.312544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-client-ca-bundle\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.318290 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.318262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz5hf\" (UniqueName: \"kubernetes.io/projected/2289d69c-8dd9-4860-a6cb-1ff3bfc94470-kube-api-access-qz5hf\") pod \"metrics-server-8bc855446-2vt7r\" (UID: \"2289d69c-8dd9-4860-a6cb-1ff3bfc94470\") " pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.472845 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.472750 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"]
Apr 20 20:08:33.477134 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.477116 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:33.479377 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.479356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 20:08:33.479459 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.479361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-g6zjs\""
Apr 20 20:08:33.483859 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.483836 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"]
Apr 20 20:08:33.485196 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.485180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r"
Apr 20 20:08:33.511460 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.511436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2m4t9\" (UID: \"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:33.612555 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.612520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2m4t9\" (UID: \"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:33.612705 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:33.612687 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 20:08:33.612764 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:08:33.612755 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert podName:7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:34.112738334 +0000 UTC m=+188.467341076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-2m4t9" (UID: "7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0") : secret "monitoring-plugin-cert" not found
Apr 20 20:08:33.644019 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.643902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8bc855446-2vt7r"]
Apr 20 20:08:33.646472 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:33.646448 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2289d69c_8dd9_4860_a6cb_1ff3bfc94470.slice/crio-c03402b9bd46e2cf7a42e3ea4fb591847a94092270965827748c0cf41ab12fbf WatchSource:0}: Error finding container c03402b9bd46e2cf7a42e3ea4fb591847a94092270965827748c0cf41ab12fbf: Status 404 returned error can't find the container with id c03402b9bd46e2cf7a42e3ea4fb591847a94092270965827748c0cf41ab12fbf
Apr 20 20:08:33.833359 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:33.833298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" event={"ID":"2289d69c-8dd9-4860-a6cb-1ff3bfc94470","Type":"ContainerStarted","Data":"c03402b9bd46e2cf7a42e3ea4fb591847a94092270965827748c0cf41ab12fbf"}
Apr 20 20:08:34.116715 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:34.116688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2m4t9\" (UID: \"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:34.119502 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:34.119477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2m4t9\" (UID: \"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:34.386651 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:34.386579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"
Apr 20 20:08:34.513371 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:34.513343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9"]
Apr 20 20:08:34.896783 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:34.896743 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cd7ad34_6ea8_46c6_8919_9d22f21ec3e0.slice/crio-46749d09b61401b68bbb78a93dca41bb5c85b9bde071c0369fd85d3e718483ba WatchSource:0}: Error finding container 46749d09b61401b68bbb78a93dca41bb5c85b9bde071c0369fd85d3e718483ba: Status 404 returned error can't find the container with id 46749d09b61401b68bbb78a93dca41bb5c85b9bde071c0369fd85d3e718483ba
Apr 20 20:08:35.092166 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.092140 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:08:35.095607 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.095583 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.099562 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.099516 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fz4pd\""
Apr 20 20:08:35.099670 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.099634 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-805sr78pgg8la\""
Apr 20 20:08:35.100241 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.100210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 20:08:35.100343 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.100240 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 20:08:35.100407 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.100361 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 20:08:35.100737 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.100709 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 20:08:35.100842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.100806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 20:08:35.104481 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.101968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 20:08:35.104481 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.102416 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 20:08:35.104481 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.104319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 20:08:35.104481 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.104393 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 20:08:35.104726 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.104662 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 20:08:35.109813 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.109522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 20:08:35.112256 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.112236 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 20:08:35.126194 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126280 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126280 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126280 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126437 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126437 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126437 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126437 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgw4\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.126683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.127180 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.126692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.227889 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.227805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.227889 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.227840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.227889 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.227859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.227889 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.227875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.227994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crgw4\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228187 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228515 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228515 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.228515 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.228239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.229676 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.229321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.230852 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.230826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.230955 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.230896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.231025 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.231005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.231078 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.231046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.231133 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.231097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.231133 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.231123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.232224 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.232198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.232430 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.232242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.232692 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.232666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.232992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.232944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.233105 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.233105 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.233722 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.233854 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.233936 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.234004 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.233964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.234312 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.234292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.234434 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.234417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.235030 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.235011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.235308 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.235291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:35.250069 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.250049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgw4\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4\") pod
\"prometheus-k8s-0\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:35.412517 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.412490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:35.572285 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.572228 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:08:35.578042 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:08:35.578010 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode35a2045_4177_4415_83f1_6ed2275787d4.slice/crio-8f81a20dd4189d159c3c0ef29ae474a2e12de2ebe5cc70106c90a02eb6982bc5 WatchSource:0}: Error finding container 8f81a20dd4189d159c3c0ef29ae474a2e12de2ebe5cc70106c90a02eb6982bc5: Status 404 returned error can't find the container with id 8f81a20dd4189d159c3c0ef29ae474a2e12de2ebe5cc70106c90a02eb6982bc5 Apr 20 20:08:35.840902 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.840860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" event={"ID":"2289d69c-8dd9-4860-a6cb-1ff3bfc94470","Type":"ContainerStarted","Data":"3952a749b0df75ccd1fd82f0835af8037a78b38f1e5d35e7ba4a1f5409c5bdd0"} Apr 20 20:08:35.842098 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.842065 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"8f81a20dd4189d159c3c0ef29ae474a2e12de2ebe5cc70106c90a02eb6982bc5"} Apr 20 20:08:35.843144 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.843108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9" 
event={"ID":"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0","Type":"ContainerStarted","Data":"46749d09b61401b68bbb78a93dca41bb5c85b9bde071c0369fd85d3e718483ba"} Apr 20 20:08:35.858346 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:35.858297 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" podStartSLOduration=1.560992373 podStartE2EDuration="2.858282827s" podCreationTimestamp="2026-04-20 20:08:33 +0000 UTC" firstStartedPulling="2026-04-20 20:08:33.648292186 +0000 UTC m=+188.002894911" lastFinishedPulling="2026-04-20 20:08:34.945582639 +0000 UTC m=+189.300185365" observedRunningTime="2026-04-20 20:08:35.85784297 +0000 UTC m=+190.212445717" watchObservedRunningTime="2026-04-20 20:08:35.858282827 +0000 UTC m=+190.212885576" Apr 20 20:08:36.847983 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:36.847936 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" exitCode=0 Apr 20 20:08:36.848336 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:36.847997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} Apr 20 20:08:36.849402 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:36.849377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9" event={"ID":"7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0","Type":"ContainerStarted","Data":"30ae2f5f44fdfd26b11f5bd0ab15ef4fc1b530fe008ec1db016db1e5e5047b98"} Apr 20 20:08:36.900089 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:36.900013 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9" 
podStartSLOduration=2.71208925 podStartE2EDuration="3.900002953s" podCreationTimestamp="2026-04-20 20:08:33 +0000 UTC" firstStartedPulling="2026-04-20 20:08:34.900375047 +0000 UTC m=+189.254977774" lastFinishedPulling="2026-04-20 20:08:36.088288751 +0000 UTC m=+190.442891477" observedRunningTime="2026-04-20 20:08:36.899116056 +0000 UTC m=+191.253718803" watchObservedRunningTime="2026-04-20 20:08:36.900002953 +0000 UTC m=+191.254605699" Apr 20 20:08:37.852738 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:37.852673 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9" Apr 20 20:08:37.858308 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:37.858286 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2m4t9" Apr 20 20:08:39.860286 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:39.860241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} Apr 20 20:08:39.860756 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:39.860293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} Apr 20 20:08:41.868973 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:41.868933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} Apr 20 20:08:41.868973 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:41.868969 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} Apr 20 20:08:41.869392 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:41.868983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} Apr 20 20:08:41.869392 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:41.868993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerStarted","Data":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} Apr 20 20:08:41.898648 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:41.898582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.466111681 podStartE2EDuration="6.898563288s" podCreationTimestamp="2026-04-20 20:08:35 +0000 UTC" firstStartedPulling="2026-04-20 20:08:35.582918304 +0000 UTC m=+189.937521032" lastFinishedPulling="2026-04-20 20:08:41.015369915 +0000 UTC m=+195.369972639" observedRunningTime="2026-04-20 20:08:41.897105945 +0000 UTC m=+196.251708697" watchObservedRunningTime="2026-04-20 20:08:41.898563288 +0000 UTC m=+196.253166035" Apr 20 20:08:45.413705 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:45.413661 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:53.485721 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:53.485675 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" Apr 20 20:08:53.485721 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:08:53.485726 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" Apr 20 20:08:57.455759 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:08:57.455697 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" podUID="267cbc4f-38d3-482c-8cd4-ecdd11902c20" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:07.455140 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:07.455100 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" podUID="267cbc4f-38d3-482c-8cd4-ecdd11902c20" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:13.491441 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:13.491410 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" Apr 20 20:09:13.495369 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:13.495348 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8bc855446-2vt7r" Apr 20 20:09:17.455914 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.455877 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" podUID="267cbc4f-38d3-482c-8cd4-ecdd11902c20" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:17.456275 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.455948 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" Apr 20 20:09:17.456422 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:09:17.456391 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"491ca9451e93304103ab83f2cc71623a4b43c0338f06d29c81c8784fa1d988d9"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 20:09:17.456467 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.456453 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" podUID="267cbc4f-38d3-482c-8cd4-ecdd11902c20" containerName="service-proxy" containerID="cri-o://491ca9451e93304103ab83f2cc71623a4b43c0338f06d29c81c8784fa1d988d9" gracePeriod=30 Apr 20 20:09:17.973936 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.973905 2575 generic.go:358] "Generic (PLEG): container finished" podID="267cbc4f-38d3-482c-8cd4-ecdd11902c20" containerID="491ca9451e93304103ab83f2cc71623a4b43c0338f06d29c81c8784fa1d988d9" exitCode=2 Apr 20 20:09:17.974090 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.973980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerDied","Data":"491ca9451e93304103ab83f2cc71623a4b43c0338f06d29c81c8784fa1d988d9"} Apr 20 20:09:17.974090 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:17.974023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f486444b4-gczbt" event={"ID":"267cbc4f-38d3-482c-8cd4-ecdd11902c20","Type":"ContainerStarted","Data":"737a062cebb5e1471fd407a32094fe421844d4b9de9009b041509ca9c64217fa"} Apr 20 20:09:35.413501 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:35.413460 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:35.432740 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:35.432709 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:36.044717 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:36.044680 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:38.109526 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:38.109494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:09:38.111875 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:38.111848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5ae3fcb-9103-43ca-a45c-46f86f99016e-metrics-certs\") pod \"network-metrics-daemon-tkqg4\" (UID: \"e5ae3fcb-9103-43ca-a45c-46f86f99016e\") " pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:09:38.220625 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:38.220595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:09:38.227563 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:38.227542 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tkqg4" Apr 20 20:09:38.346890 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:38.346752 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tkqg4"] Apr 20 20:09:38.349643 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:09:38.349614 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ae3fcb_9103_43ca_a45c_46f86f99016e.slice/crio-71606f78a8399a85ddc05b43b90f645c65b833545bd40c788c881261c1e9b40e WatchSource:0}: Error finding container 71606f78a8399a85ddc05b43b90f645c65b833545bd40c788c881261c1e9b40e: Status 404 returned error can't find the container with id 71606f78a8399a85ddc05b43b90f645c65b833545bd40c788c881261c1e9b40e Apr 20 20:09:39.039132 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:39.039092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tkqg4" event={"ID":"e5ae3fcb-9103-43ca-a45c-46f86f99016e","Type":"ContainerStarted","Data":"71606f78a8399a85ddc05b43b90f645c65b833545bd40c788c881261c1e9b40e"} Apr 20 20:09:40.045171 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:40.045131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tkqg4" event={"ID":"e5ae3fcb-9103-43ca-a45c-46f86f99016e","Type":"ContainerStarted","Data":"996b6ee8a69b3daeac26b421b058d12ff302ebba74d1a6004861de9d54632d12"} Apr 20 20:09:40.045171 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:40.045172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tkqg4" event={"ID":"e5ae3fcb-9103-43ca-a45c-46f86f99016e","Type":"ContainerStarted","Data":"f7a60ca01d3ec9d879779c7ca32edc2b7501ab1ea3b0b52894d07168d3803e81"} Apr 20 20:09:40.063540 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:40.063485 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-tkqg4" podStartSLOduration=253.002017032 podStartE2EDuration="4m14.063466367s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:09:38.351658141 +0000 UTC m=+252.706260880" lastFinishedPulling="2026-04-20 20:09:39.413107488 +0000 UTC m=+253.767710215" observedRunningTime="2026-04-20 20:09:40.062079393 +0000 UTC m=+254.416682138" watchObservedRunningTime="2026-04-20 20:09:40.063466367 +0000 UTC m=+254.418069116" Apr 20 20:09:53.360467 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.360426 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:53.360991 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.360924 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="prometheus" containerID="cri-o://963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" gracePeriod=600 Apr 20 20:09:53.361134 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.361054 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-web" containerID="cri-o://98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" gracePeriod=600 Apr 20 20:09:53.361134 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.361071 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="config-reloader" containerID="cri-o://a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" gracePeriod=600 Apr 20 20:09:53.361134 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.360961 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy" containerID="cri-o://c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" gracePeriod=600 Apr 20 20:09:53.361308 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.360997 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="thanos-sidecar" containerID="cri-o://a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" gracePeriod=600 Apr 20 20:09:53.361308 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.361035 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" gracePeriod=600 Apr 20 20:09:53.612844 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.612770 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:53.620709 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620689 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.620842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620722 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.620842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620740 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.620842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620766 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.620842 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620808 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: 
\"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621074 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620912 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621074 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620964 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621074 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.620994 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621074 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.621036 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621074 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.621062 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" 
(UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621345 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.621100 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") " Apr 20 20:09:53.621345 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.621170 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.621701 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622130 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622279 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622329 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622372 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622403 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622426 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crgw4\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622527 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out\") pod \"e35a2045-4177-4415-83f1-6ed2275787d4\" (UID: \"e35a2045-4177-4415-83f1-6ed2275787d4\") "
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622722 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622739 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622752 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-metrics-client-ca\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622765 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.622786 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.623580 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:09:53.625662 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.624541 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:09:53.626598 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.625629 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.626598 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.625835 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out" (OuterVolumeSpecName: "config-out") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:09:53.626598 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.626228 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config" (OuterVolumeSpecName: "config") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.626756 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.626671 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.627121 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627094 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.627356 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627331 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.627460 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627392 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4" (OuterVolumeSpecName: "kube-api-access-crgw4") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "kube-api-access-crgw4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:09:53.627624 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627597 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.627743 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627720 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.627827 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.627734 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.640021 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.639733 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config" (OuterVolumeSpecName: "web-config") pod "e35a2045-4177-4415-83f1-6ed2275787d4" (UID: "e35a2045-4177-4415-83f1-6ed2275787d4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:53.723725 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723688 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-k8s-db\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723725 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723720 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35a2045-4177-4415-83f1-6ed2275787d4-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723725 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723731 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-kube-rbac-proxy\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723742 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723752 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crgw4\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-kube-api-access-crgw4\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723761 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35a2045-4177-4415-83f1-6ed2275787d4-config-out\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723769 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35a2045-4177-4415-83f1-6ed2275787d4-tls-assets\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723778 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723787 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-metrics-client-certs\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723815 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-config\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723824 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723834 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723843 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-web-config\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:53.723956 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:53.723851 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e35a2045-4177-4415-83f1-6ed2275787d4-secret-grpc-tls\") on node \"ip-10-0-140-155.ec2.internal\" DevicePath \"\""
Apr 20 20:09:54.089075 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089038 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" exitCode=0
Apr 20 20:09:54.089075 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089065 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" exitCode=0
Apr 20 20:09:54.089075 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089071 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" exitCode=0
Apr 20 20:09:54.089075 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089079 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" exitCode=0
Apr 20 20:09:54.089075 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089084 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" exitCode=0
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089090 2575 generic.go:358] "Generic (PLEG): container finished" podID="e35a2045-4177-4415-83f1-6ed2275787d4" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" exitCode=0
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089193 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089214 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"}
Apr 20 20:09:54.089474 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.089341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e35a2045-4177-4415-83f1-6ed2275787d4","Type":"ContainerDied","Data":"8f81a20dd4189d159c3c0ef29ae474a2e12de2ebe5cc70106c90a02eb6982bc5"}
Apr 20 20:09:54.099024 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.099005 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"
Apr 20 20:09:54.106453 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.106374 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"
Apr 20 20:09:54.113905 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.113880 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:09:54.114528 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.114491 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"
Apr 20 20:09:54.119141 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.119114 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:09:54.121899 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.121879 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"
Apr 20 20:09:54.128524 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.128502 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"
Apr 20 20:09:54.135347 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.135323 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"
Apr 20 20:09:54.141614 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.141596 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"
Apr 20 20:09:54.141903 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.141881 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"
Apr 20 20:09:54.141986 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.141925 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist"
Apr 20 20:09:54.141986 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.141950 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"
Apr 20 20:09:54.142254 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.142235 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"
Apr 20 20:09:54.142301 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142265 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist"
Apr 20 20:09:54.142301 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142282 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"
Apr 20 20:09:54.142512 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.142497 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"
Apr 20 20:09:54.142583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142547 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist"
Apr 20 20:09:54.142583 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142562 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"
Apr 20 20:09:54.142875 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.142842 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"
Apr 20 20:09:54.142964 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142871 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist"
Apr 20 20:09:54.142964 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.142892 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"
Apr 20 20:09:54.143271 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.143251 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"
Apr 20 20:09:54.143340 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143276 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist"
Apr 20 20:09:54.143340 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143294 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"
Apr 20 20:09:54.143583 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.143560 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"
Apr 20 20:09:54.143726 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143585 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist"
Apr 20 20:09:54.143726 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143605 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"
Apr 20 20:09:54.143909 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143883 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:09:54.144041 ip-10-0-140-155 kubenswrapper[2575]: E0420 20:09:54.143889 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"
Apr 20 20:09:54.144041 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.143987 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist"
Apr 20 20:09:54.144041 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144005 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"
Apr 20 20:09:54.144248 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144231 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-web"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144252 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-web"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144267 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="init-config-reloader"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144265 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144276 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="init-config-reloader"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144281 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"
Apr 20 20:09:54.144302 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144301 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="thanos-sidecar"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144309 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="thanos-sidecar"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144318 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144327 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144339 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-thanos"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144347 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-thanos"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144358 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="prometheus"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144366 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="prometheus"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144375 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="config-reloader"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144383 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="config-reloader"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144450 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="thanos-sidecar"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144464 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144474 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-thanos"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144485 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="kube-rbac-proxy-web"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144497 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="prometheus"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144508 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" containerName="config-reloader"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144514 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist"
Apr 20 20:09:54.144611 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144528 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144766 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.144787 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145085 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145101 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145296 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist"
Apr 20 20:09:54.145397 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145311 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"
Apr 20 20:09:54.145672 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145522 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist"
Apr 20 20:09:54.145672 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145538 2575 scope.go:117] "RemoveContainer"
containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" Apr 20 20:09:54.145764 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145749 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" Apr 20 20:09:54.145839 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.145768 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" Apr 20 20:09:54.146041 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146016 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist" Apr 20 20:09:54.146087 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146041 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" Apr 20 20:09:54.146293 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146275 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status 
\"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist" Apr 20 20:09:54.146331 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146294 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" Apr 20 20:09:54.146510 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146491 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist" Apr 20 20:09:54.146553 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146511 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" Apr 20 20:09:54.146741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146725 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist" Apr 20 20:09:54.146785 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:09:54.146741 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" Apr 20 20:09:54.146967 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146947 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist" Apr 20 20:09:54.147028 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.146969 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" Apr 20 20:09:54.147202 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147185 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist" Apr 20 20:09:54.147202 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147202 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" Apr 20 20:09:54.147411 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147392 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" Apr 20 20:09:54.147453 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147412 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" Apr 20 20:09:54.147617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147600 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist" Apr 20 20:09:54.147697 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147619 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" Apr 20 20:09:54.147859 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147839 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with 
c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist" Apr 20 20:09:54.147924 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.147861 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" Apr 20 20:09:54.148104 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148076 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist" Apr 20 20:09:54.148104 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148104 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" Apr 20 20:09:54.148343 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148324 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist" Apr 20 20:09:54.148438 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148345 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" Apr 20 20:09:54.148643 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148618 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist" Apr 20 20:09:54.148749 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148644 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" Apr 20 20:09:54.148851 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148758 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.148914 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148894 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist" Apr 20 20:09:54.148965 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.148918 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" Apr 20 20:09:54.149432 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.149404 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to 
get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" Apr 20 20:09:54.149528 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.149464 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" Apr 20 20:09:54.149811 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.149764 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist" Apr 20 20:09:54.149911 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.149817 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150085 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist" Apr 20 20:09:54.152741 
ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150112 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150392 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150415 2575 scope.go:117] "RemoveContainer" containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150698 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.150718 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151018 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151066 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151292 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151314 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151544 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 
4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151565 2575 scope.go:117] "RemoveContainer" containerID="a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151862 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900"} err="failed to get container status \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": rpc error: code = NotFound desc = could not find container \"a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900\": container with ID starting with a60658c5e5cf7aef290cc420fdca9e83a523048e06d1abeb27b4fe9105ecf900 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151882 2575 scope.go:117] "RemoveContainer" containerID="c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.151991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.152389 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19"} err="failed to get container status \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": rpc error: code = NotFound desc = could not find container \"c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19\": container with ID starting with c17abdc7167993907a0c0b7b687704f19131a388f824a508a6227d522bc87e19 not found: ID does not exist" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 
20:09:54.152415 2575 scope.go:117] "RemoveContainer" containerID="98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272" Apr 20 20:09:54.152741 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.152628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.152843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.152996 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153023 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153018 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272"} err="failed to get container status \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": rpc error: code = NotFound desc = could not find container \"98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272\": container with ID starting with 98bf01267de8baa9ea9ba5c58bfc8b46dc2883f801e8eb6c9379d2dba1f36272 not found: ID does not exist" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153041 2575 scope.go:117] "RemoveContainer" 
containerID="a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153148 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153354 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6"} err="failed to get container status \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": rpc error: code = NotFound desc = could not find container \"a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6\": container with ID starting with a9a17c1906353b2e71ce8bdbf3f027b705a51fc3f43a644e9818b7c1901110d6 not found: ID does not exist" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153430 2575 scope.go:117] "RemoveContainer" containerID="a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e" Apr 20 20:09:54.153617 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.153589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fz4pd\"" Apr 20 20:09:54.154269 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154066 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e"} 
err="failed to get container status \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": rpc error: code = NotFound desc = could not find container \"a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e\": container with ID starting with a168dc68befebe0644244adc81e5540da424757f691a91f5b3a85563fc852a8e not found: ID does not exist" Apr 20 20:09:54.154269 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154089 2575 scope.go:117] "RemoveContainer" containerID="963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6" Apr 20 20:09:54.154269 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154190 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-805sr78pgg8la\"" Apr 20 20:09:54.154269 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:09:54.154840 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154575 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6"} err="failed to get container status \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": rpc error: code = NotFound desc = could not find container \"963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6\": container with ID starting with 963893e29618affe4dabd975e45712ef6741fcb8f0a7aa273c0a88379d63b8c6 not found: ID does not exist" Apr 20 20:09:54.154840 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.154602 2575 scope.go:117] "RemoveContainer" containerID="4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7" Apr 20 20:09:54.155151 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.155040 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7"} err="failed to get container status \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": rpc error: code = NotFound desc = could not find container \"4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7\": container with ID starting with 4ebcf69b088d5bf80d63c00072e296068a6987dca2ed4184e253d349c02a96c7 not found: ID does not exist" Apr 20 20:09:54.156454 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.156429 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:09:54.158024 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.158004 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:09:54.163160 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.163135 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:54.220965 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.220941 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35a2045-4177-4415-83f1-6ed2275787d4" path="/var/lib/kubelet/pods/e35a2045-4177-4415-83f1-6ed2275787d4/volumes" Apr 20 20:09:54.227683 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227773 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227773 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227872 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227918 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227918 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c46m\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-kube-api-access-8c46m\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
20:09:54.227992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.227992 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228084 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.227997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228084 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-kube-rbac-proxy-web\") 
pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228084 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228183 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228217 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228217 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228294 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228228 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228294 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.228352 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.228327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329213 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329328 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329328 
ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329328 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329328 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329328 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329538 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329538 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329661 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329760 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329860 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329860 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329966 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329966 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.329966 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330112 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.329982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330112 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.330019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330112 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.330043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330112 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.330073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c46m\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-kube-api-access-8c46m\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330329 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.330307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.330689 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.330442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.332504 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.332473 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.332504 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.332502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333291 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.332970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333291 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.333118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333291 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.333170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333291 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.333208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333546 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.333343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.333546 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.333350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.334091 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.334064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.334288 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.334267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.335306 ip-10-0-140-155 kubenswrapper[2575]: I0420 
20:09:54.335279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.335385 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.335314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.335567 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.335550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.335642 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.335625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.337360 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.337344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c46m\" (UniqueName: \"kubernetes.io/projected/cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6-kube-api-access-8c46m\") pod \"prometheus-k8s-0\" (UID: \"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.459676 ip-10-0-140-155 
kubenswrapper[2575]: I0420 20:09:54.459651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:54.583517 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:54.583487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:54.587455 ip-10-0-140-155 kubenswrapper[2575]: W0420 20:09:54.587426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc5ad4b_ab57_4af9_9d26_62806fdb6eb6.slice/crio-e12b7a0f113285bd5d353a29321df136455ce1fea249418df566e338e57d9b59 WatchSource:0}: Error finding container e12b7a0f113285bd5d353a29321df136455ce1fea249418df566e338e57d9b59: Status 404 returned error can't find the container with id e12b7a0f113285bd5d353a29321df136455ce1fea249418df566e338e57d9b59 Apr 20 20:09:55.093274 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:55.093242 2575 generic.go:358] "Generic (PLEG): container finished" podID="cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6" containerID="85ed43f55e0d4690835640f9267bfd5c9422cebd7661de149738a92e965157cb" exitCode=0 Apr 20 20:09:55.093440 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:55.093335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerDied","Data":"85ed43f55e0d4690835640f9267bfd5c9422cebd7661de149738a92e965157cb"} Apr 20 20:09:55.093440 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:55.093368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"e12b7a0f113285bd5d353a29321df136455ce1fea249418df566e338e57d9b59"} Apr 20 20:09:56.100589 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"9526ef8c4018e91c1a88811a564b4a0dd45b90a955e3b8e9f78ec88fc12d0cad"} Apr 20 20:09:56.100589 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100592 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"76226b7ec09167779571739319683f9f6e8d598d78d7fb0b851db4186b0af55e"} Apr 20 20:09:56.100999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"4efe7e78e205cccc00300065cf0087afaa1fbae7ce9c8f192b00a45be8818735"} Apr 20 20:09:56.100999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"c2369411b88048462cf2c25a581acd7c270eb813723963e03cc32981a0fad944"} Apr 20 20:09:56.100999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"e452158ec1da38a7bd841c7c3e492cea98c6d6725ee6597c6f37a783e1a675aa"} Apr 20 20:09:56.100999 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.100628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6","Type":"ContainerStarted","Data":"92b10ae72ecf901418457f2de8516eade6d8f7b55169c44a688c6ada90c32e21"} Apr 20 20:09:56.126810 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:56.126746 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.126732947 podStartE2EDuration="2.126732947s" podCreationTimestamp="2026-04-20 20:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:56.125451166 +0000 UTC m=+270.480053927" watchObservedRunningTime="2026-04-20 20:09:56.126732947 +0000 UTC m=+270.481335692" Apr 20 20:09:59.459908 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:09:59.459863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:26.107126 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:10:26.107096 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:10:26.109840 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:10:26.109819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:10:54.460833 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:10:54.460782 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:54.478300 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:10:54.478273 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:55.281630 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:10:55.281588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:26.126890 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:15:26.126848 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:15:26.129350 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:15:26.129327 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:20:26.150686 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:20:26.150656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:20:26.154950 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:20:26.154929 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:25:26.168228 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:25:26.168193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:25:26.174285 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:25:26.174266 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:30:26.188817 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:30:26.188769 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:30:26.195088 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:30:26.195065 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:35:26.209183 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:35:26.209145 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:35:26.214609 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:35:26.214585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:40:26.231868 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:40:26.231835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:40:26.236257 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:40:26.236234 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:45:26.253831 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:45:26.253784 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:45:26.259728 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:45:26.259704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:50:26.272163 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:50:26.272084 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:50:26.278095 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:50:26.278076 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:55:26.292005 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:55:26.291974 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 20:55:26.297539 ip-10-0-140-155 kubenswrapper[2575]: I0420 20:55:26.297518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:00:26.310427 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:00:26.310398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:00:26.315868 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:00:26.315846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:05:26.327468 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:05:26.327356 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:05:26.333609 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:05:26.333592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:10:26.344936 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:10:26.344907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:10:26.351054 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:10:26.351035 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log" Apr 20 21:12:00.170372 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.170335 2575 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-qsjk5/must-gather-s79bm"]
Apr 20 21:12:00.173559 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.173538 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.176814 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.176130 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"openshift-service-ca.crt\""
Apr 20 21:12:00.176814 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.176140 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"kube-root-ca.crt\""
Apr 20 21:12:00.176984 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.176899 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qsjk5\"/\"default-dockercfg-nn688\""
Apr 20 21:12:00.180714 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.180690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/must-gather-s79bm"]
Apr 20 21:12:00.290110 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.290079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-must-gather-output\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.290110 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.290111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jfh\" (UniqueName: \"kubernetes.io/projected/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-kube-api-access-x2jfh\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.390709 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.390683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-must-gather-output\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.390862 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.390712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jfh\" (UniqueName: \"kubernetes.io/projected/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-kube-api-access-x2jfh\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.391019 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.391001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-must-gather-output\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.399146 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.399126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jfh\" (UniqueName: \"kubernetes.io/projected/1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d-kube-api-access-x2jfh\") pod \"must-gather-s79bm\" (UID: \"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d\") " pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.486834 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.486745 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/must-gather-s79bm"
Apr 20 21:12:00.600723 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.600692 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/must-gather-s79bm"]
Apr 20 21:12:00.604270 ip-10-0-140-155 kubenswrapper[2575]: W0420 21:12:00.604240 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec8b4d2_4ee6_4143_a0aa_ac0957edbc4d.slice/crio-0251efbe5ac93b240a2227baf55e397d4de30c4fbbcd3e23d9b1ec7435de84e4 WatchSource:0}: Error finding container 0251efbe5ac93b240a2227baf55e397d4de30c4fbbcd3e23d9b1ec7435de84e4: Status 404 returned error can't find the container with id 0251efbe5ac93b240a2227baf55e397d4de30c4fbbcd3e23d9b1ec7435de84e4
Apr 20 21:12:00.605997 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:00.605982 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:12:01.423411 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:01.423379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/must-gather-s79bm" event={"ID":"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d","Type":"ContainerStarted","Data":"0251efbe5ac93b240a2227baf55e397d4de30c4fbbcd3e23d9b1ec7435de84e4"}
Apr 20 21:12:02.429637 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.429605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/must-gather-s79bm" event={"ID":"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d","Type":"ContainerStarted","Data":"e7f5993098c13f171448cbff2ec3150b3d3681ac46e1de89458f28e7a946d643"}
Apr 20 21:12:02.429637 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.429641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/must-gather-s79bm" event={"ID":"1ec8b4d2-4ee6-4143-a0aa-ac0957edbc4d","Type":"ContainerStarted","Data":"7049da350a6a8ad868e3d67d2e8c7f9bec0920af4e5852b5060305be8003db67"}
Apr 20 21:12:02.444608 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.444563 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qsjk5/must-gather-s79bm" podStartSLOduration=1.658065796 podStartE2EDuration="2.444547076s" podCreationTimestamp="2026-04-20 21:12:00 +0000 UTC" firstStartedPulling="2026-04-20 21:12:00.606108382 +0000 UTC m=+3994.960711105" lastFinishedPulling="2026-04-20 21:12:01.392589647 +0000 UTC m=+3995.747192385" observedRunningTime="2026-04-20 21:12:02.442267215 +0000 UTC m=+3996.796869962" watchObservedRunningTime="2026-04-20 21:12:02.444547076 +0000 UTC m=+3996.799149822"
Apr 20 21:12:02.819578 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.819509 2575 ???:1] "http: TLS handshake error from 10.0.140.155:35064: EOF"
Apr 20 21:12:02.827667 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.827644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9rdlg_44d1f99f-7189-4247-a28d-b55ea139aca0/global-pull-secret-syncer/0.log"
Apr 20 21:12:02.972509 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:02.972474 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-b2hzv_2cdb632c-969a-4d5b-96a5-44863d4b1606/konnectivity-agent/0.log"
Apr 20 21:12:03.050622 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:03.050592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-155.ec2.internal_fb94a1a385423d979d451957f03175cb/haproxy/0.log"
Apr 20 21:12:06.535065 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.534975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8bc855446-2vt7r_2289d69c-8dd9-4860-a6cb-1ff3bfc94470/metrics-server/0.log"
Apr 20 21:12:06.561269 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.561237 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-2m4t9_7cd7ad34-6ea8-46c6-8919-9d22f21ec3e0/monitoring-plugin/0.log"
Apr 20 21:12:06.598271 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.598222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-76tnk_40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce/node-exporter/0.log"
Apr 20 21:12:06.620284 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.620237 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-76tnk_40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce/kube-rbac-proxy/0.log"
Apr 20 21:12:06.645342 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.645315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-76tnk_40a7c052-d7ca-4d0f-ab9d-6715e8cfb1ce/init-textfile/0.log"
Apr 20 21:12:06.926744 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.926685 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/prometheus/0.log"
Apr 20 21:12:06.945230 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.945199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/config-reloader/0.log"
Apr 20 21:12:06.971167 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.971138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/thanos-sidecar/0.log"
Apr 20 21:12:06.991718 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:06.991691 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/kube-rbac-proxy-web/0.log"
Apr 20 21:12:07.015759 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:07.015737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/kube-rbac-proxy/0.log"
Apr 20 21:12:07.037944 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:07.037908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/kube-rbac-proxy-thanos/0.log"
Apr 20 21:12:07.059204 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:07.059173 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cdc5ad4b-ab57-4af9-9d26-62806fdb6eb6/init-config-reloader/0.log"
Apr 20 21:12:07.095095 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:07.095065 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9npmw_95353743-5969-4ff5-8f9b-7bceb016d992/prometheus-operator/0.log"
Apr 20 21:12:07.110650 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:07.110628 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9npmw_95353743-5969-4ff5-8f9b-7bceb016d992/kube-rbac-proxy/0.log"
Apr 20 21:12:09.731492 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.731434 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"]
Apr 20 21:12:09.735661 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.735633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.744986 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.744960 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"]
Apr 20 21:12:09.878210 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.878172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-proc\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.878210 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.878208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-lib-modules\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.878391 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.878272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-podres\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.878391 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.878297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/2920c589-c96c-4c28-adf1-2211232bd84c-kube-api-access-4bltz\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.878391 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.878369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-sys\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979288 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-podres\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979440 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/2920c589-c96c-4c28-adf1-2211232bd84c-kube-api-access-4bltz\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979440 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-sys\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979440 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-proc\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979440 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-lib-modules\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979440 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-podres\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979600 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-sys\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979600 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-proc\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.979600 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.979490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2920c589-c96c-4c28-adf1-2211232bd84c-lib-modules\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:09.989930 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:09.989851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/2920c589-c96c-4c28-adf1-2211232bd84c-kube-api-access-4bltz\") pod \"perf-node-gather-daemonset-rv7xm\" (UID: \"2920c589-c96c-4c28-adf1-2211232bd84c\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:10.046468 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.046441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:10.171727 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.171087 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"]
Apr 20 21:12:10.174564 ip-10-0-140-155 kubenswrapper[2575]: W0420 21:12:10.174533 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2920c589_c96c_4c28_adf1_2211232bd84c.slice/crio-6fb57ab0a7c197e862daae8caacfaaef706245cb176c8728c671ad9d2e0b67ac WatchSource:0}: Error finding container 6fb57ab0a7c197e862daae8caacfaaef706245cb176c8728c671ad9d2e0b67ac: Status 404 returned error can't find the container with id 6fb57ab0a7c197e862daae8caacfaaef706245cb176c8728c671ad9d2e0b67ac
Apr 20 21:12:10.459310 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.459275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm" event={"ID":"2920c589-c96c-4c28-adf1-2211232bd84c","Type":"ContainerStarted","Data":"b79803c3f56fc40c125d15404f63f6ac20cc845d1cc0596ced5dd8902d548c84"}
Apr 20 21:12:10.459310 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.459313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm" event={"ID":"2920c589-c96c-4c28-adf1-2211232bd84c","Type":"ContainerStarted","Data":"6fb57ab0a7c197e862daae8caacfaaef706245cb176c8728c671ad9d2e0b67ac"}
Apr 20 21:12:10.459504 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.459341 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:10.481525 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.481386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm" podStartSLOduration=1.481366111 podStartE2EDuration="1.481366111s" podCreationTimestamp="2026-04-20 21:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:12:10.479182397 +0000 UTC m=+4004.833785146" watchObservedRunningTime="2026-04-20 21:12:10.481366111 +0000 UTC m=+4004.835968858"
Apr 20 21:12:10.705935 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.705908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mz59j_86fdf4f4-4135-45ec-8ea1-ade53c77d176/dns/0.log"
Apr 20 21:12:10.736844 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.736822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mz59j_86fdf4f4-4135-45ec-8ea1-ade53c77d176/kube-rbac-proxy/0.log"
Apr 20 21:12:10.897401 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:10.897371 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmx7s_825cf132-8688-4d28-b93d-28380334363a/dns-node-resolver/0.log"
Apr 20 21:12:11.417536 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:11.417507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vhc9b_dc97d8be-f2be-4f95-b5f8-2f30bd207040/node-ca/0.log"
Apr 20 21:12:12.449784 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:12.449759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vw9v8_857f5f33-f93c-4621-bf95-3178c8f078b1/serve-healthcheck-canary/0.log"
Apr 20 21:12:12.811118 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:12.811031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n2qv_4f69caa4-e0fd-4758-a26d-5de4cae89837/kube-rbac-proxy/0.log"
Apr 20 21:12:12.830207 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:12.830187 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n2qv_4f69caa4-e0fd-4758-a26d-5de4cae89837/exporter/0.log"
Apr 20 21:12:12.851074 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:12.851052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n2qv_4f69caa4-e0fd-4758-a26d-5de4cae89837/extractor/0.log"
Apr 20 21:12:16.472622 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:16.472592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-rv7xm"
Apr 20 21:12:20.815365 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.815337 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/kube-multus-additional-cni-plugins/0.log"
Apr 20 21:12:20.838522 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.838486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/egress-router-binary-copy/0.log"
Apr 20 21:12:20.860136 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.860111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/cni-plugins/0.log"
Apr 20 21:12:20.880898 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.880876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/bond-cni-plugin/0.log"
Apr 20 21:12:20.900685 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.900658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/routeoverride-cni/0.log"
Apr 20 21:12:20.923333 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.923310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/whereabouts-cni-bincopy/0.log"
Apr 20 21:12:20.944929 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:20.944908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zln2l_77159cf3-9f8b-4a2f-8ff1-7210fcaf5e87/whereabouts-cni/0.log"
Apr 20 21:12:21.003816 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.003769 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z2mjf_36ded35a-5271-4357-8314-e71ef065d1c8/kube-multus/0.log"
Apr 20 21:12:21.132850 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.132821 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tkqg4_e5ae3fcb-9103-43ca-a45c-46f86f99016e/network-metrics-daemon/0.log"
Apr 20 21:12:21.161597 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.161576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tkqg4_e5ae3fcb-9103-43ca-a45c-46f86f99016e/kube-rbac-proxy/0.log"
Apr 20 21:12:21.833136 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.833100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-controller/0.log"
Apr 20 21:12:21.851523 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.851500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/0.log"
Apr 20 21:12:21.870378 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.870353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovn-acl-logging/1.log"
Apr 20 21:12:21.891534 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.891509 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/kube-rbac-proxy-node/0.log"
Apr 20 21:12:21.910843 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.910815 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 21:12:21.928213 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.928191 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/northd/0.log"
Apr 20 21:12:21.951477 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.951456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/nbdb/0.log"
Apr 20 21:12:21.970502 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:21.970477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/sbdb/0.log"
Apr 20 21:12:22.091435 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:22.091407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5k984_3d96f75f-8097-4ea6-bf1c-c5bc31761969/ovnkube-controller/0.log"
Apr 20 21:12:23.744886 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:23.744860 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zwtx2_382abf9b-eb7a-4ef7-918c-50641d3558a8/network-check-target-container/0.log"
Apr 20 21:12:24.606663 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:24.606635 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5ksn8_13fedadd-686b-4e93-a2ae-cbab6edc3bff/iptables-alerter/0.log"
Apr 20 21:12:25.251645 ip-10-0-140-155 kubenswrapper[2575]: I0420 21:12:25.251618 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-98bc5_daf7a90d-c40a-4f5d-acec-d016728caae2/tuned/0.log"