Apr 23 08:50:34.057239 ip-10-0-137-31 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:50:34.557729 ip-10-0-137-31 kubenswrapper[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:34.557729 ip-10-0-137-31 kubenswrapper[2559]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:50:34.557729 ip-10-0-137-31 kubenswrapper[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:34.557729 ip-10-0-137-31 kubenswrapper[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:50:34.558711 ip-10-0-137-31 kubenswrapper[2559]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:34.560680 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.560604    2559 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:50:34.564695 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564675    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:34.564695 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564691    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:34.564695 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564695    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:34.564695 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564698    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564702    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564706    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564711    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564714    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564718    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564720    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564723    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564726    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564729    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564732    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564735    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564737    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564740    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564743    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564747    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564750    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564753    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564756    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:34.564846 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564758    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564761    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564763    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564766    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564769    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564771    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564774    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564783    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564786    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564789    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564791    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564794    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564796    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564799    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564801    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564805    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564808    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564811    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564813    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564816    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:34.565317 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564819    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564821    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564824    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564827    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564829    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564832    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564834    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564837    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564839    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564842    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564844    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564848    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564851    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564853    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564856    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564859    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564861    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564864    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564867    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564869    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:34.565802 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564872    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564874    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564877    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564879    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564882    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564884    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564888    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564891    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564893    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564896    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564899    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564902    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564904    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564907    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564910    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564912    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564916    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564919    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564922    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564924    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:34.566308 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564927    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564929    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564933    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.564936    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565336    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565343    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565345    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565348    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565351    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565354    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565356    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565359    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565361    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565364    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565366    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565369    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565371    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565374    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565377    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565380    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:34.566882 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565383    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565385    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565388    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565391    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565394    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565396    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565399    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565401    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565404    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565407    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565409    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565412    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565415    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565417    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565419    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565422    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565424    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565427    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565429    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565432    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:34.567388 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565435    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565437    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565440    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565442    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565445    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565447    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565450    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565452    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565455    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565457    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565460    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565464    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565467    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565471    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565475    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565478    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565481    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565484    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565487    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565490    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:34.567869 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565493    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565496    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565499    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565502    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565505    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565507    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565510    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565513    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565516    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565519    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565521    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565524    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565527    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565529    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565532    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565534    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565537    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565539    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565543    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:34.568383 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565546    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565548    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565551    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565554    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565557    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565560    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565562    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565565    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565568    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565570    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.565573    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565657    2559 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565667    2559 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565676    2559 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565682    2559 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565687    2559 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565691    2559 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565695    2559 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565700    2559 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565703    2559 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565706    2559 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:50:34.568861 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565710    2559 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565713    2559 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565716    2559 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565719    2559 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565722    2559 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565725    2559 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565728    2559 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565731    2559 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565734    2559 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565738    2559 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565741    2559 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565745    2559 flags.go:64] FLAG: --config-dir=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565748    2559 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565751    2559 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565755    2559 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565759    2559 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565762    2559 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565765    2559 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565769    2559 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565771    2559 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565774    2559 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565777    2559 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565780    2559 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565785    2559 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565788    2559 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:50:34.569387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565792    2559 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565794    2559 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565798    2559 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565801    2559 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565806    2559 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565810    2559 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565813    2559 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565816    2559 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565820    2559 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565824    2559 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565827    2559 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565830    2559 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565833    2559 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565836    2559 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565839    2559 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565842    2559 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565845    2559 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565848    2559 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565851    2559 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565854    2559 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565857    2559 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565860 2559 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565863 2559 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565866 2559 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565870 2559 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:50:34.570034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565873 2559 flags.go:64] FLAG: --help="false" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565875 2559 flags.go:64] FLAG: --hostname-override="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565878 2559 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565881 2559 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565884 2559 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565888 2559 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565891 2559 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565894 2559 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565897 2559 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:50:34.565900 2559 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565903 2559 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565906 2559 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565910 2559 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565913 2559 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565916 2559 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565918 2559 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565921 2559 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565924 2559 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565927 2559 flags.go:64] FLAG: --lock-file="" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565930 2559 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565933 2559 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565936 2559 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565942 2559 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:50:34.570646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565944 2559 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:50:34.571221 
ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565947 2559 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565950 2559 flags.go:64] FLAG: --logging-format="text" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565953 2559 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565956 2559 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565959 2559 flags.go:64] FLAG: --manifest-url="" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565962 2559 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565966 2559 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.565979 2559 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566000 2559 flags.go:64] FLAG: --max-pods="110" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566004 2559 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566007 2559 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566010 2559 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566013 2559 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566016 2559 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566021 2559 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566024 2559 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566032 2559 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566035 2559 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566038 2559 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566042 2559 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566045 2559 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566050 2559 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566053 2559 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:50:34.571221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566056 2559 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566058 2559 flags.go:64] FLAG: --port="10250" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566062 2559 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566065 2559 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09173fa4940a127fb" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566068 2559 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566071 
2559 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566075 2559 flags.go:64] FLAG: --register-node="true" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566077 2559 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566080 2559 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566084 2559 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566087 2559 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566090 2559 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566092 2559 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566096 2559 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566099 2559 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566102 2559 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566105 2559 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566108 2559 flags.go:64] FLAG: --runonce="false" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566111 2559 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566114 2559 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566117 2559 
flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566119 2559 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566122 2559 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566130 2559 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566133 2559 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566136 2559 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:50:34.571820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566139 2559 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566142 2559 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566145 2559 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566151 2559 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566154 2559 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566157 2559 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566160 2559 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566166 2559 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566169 2559 flags.go:64] FLAG: --tls-cert-file="" Apr 23 
08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566172 2559 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566175 2559 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566178 2559 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566181 2559 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566184 2559 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566187 2559 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566190 2559 flags.go:64] FLAG: --v="2" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566194 2559 flags.go:64] FLAG: --version="false" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566198 2559 flags.go:64] FLAG: --vmodule="" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566202 2559 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.566205 2559 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566302 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566306 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566309 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:50:34.572437 
ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566312 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:50:34.572437 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566315 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566318 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566321 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566324 2559 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566326 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566330 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566333 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566336 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566338 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566341 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566344 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566348 2559 feature_gate.go:328] unrecognized feature 
gate: NetworkSegmentation Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566352 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566355 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566358 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566360 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566363 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566366 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566368 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566370 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:50:34.573019 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566373 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566376 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566378 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566381 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:50:34.573508 ip-10-0-137-31 
kubenswrapper[2559]: W0423 08:50:34.566383 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566386 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566388 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566392 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566396 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566399 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566402 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566405 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566408 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566410 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566413 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566416 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566419 2559 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566424 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566426 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:50:34.573508 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566429 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566431 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566434 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566436 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566440 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566443 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566446 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566449 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566452 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566454 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566457 2559 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566459 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566462 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566465 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566467 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566470 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566473 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566475 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566478 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566480 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:50:34.574011 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566483 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566485 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566488 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:50:34.574502 ip-10-0-137-31 
kubenswrapper[2559]: W0423 08:50:34.566490 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566493 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566496 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566498 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566500 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566503 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566506 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566510 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566513 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566515 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566518 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566520 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566523 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 
08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566527 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566529 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566533 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566536 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:34.574502 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566539 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566543 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.566546 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.567317 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.573031 2559 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.573048 2559 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573095 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573100 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573103 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573107 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573110 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573113 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573116 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573118 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573121 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:34.575039 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573124 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573127 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573130 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573133 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573135 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573138 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573141 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573144 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573146 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573149 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573152 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573154 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573157 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573159 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573162 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573165 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573168 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573170 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573172 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573175 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:34.575423 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573177 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573181 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573187 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573191 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573194 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573197 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573200 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573203 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573205 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573208 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573211 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573213 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573216 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573218 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573221 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573224 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573226 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573229 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573232 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:34.575904 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573234 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573237 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573239 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573242 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573244 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573247 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573249 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573251 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573254 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573256 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573259 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573261 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573264 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573267 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573269 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573273 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573280 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573283 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573286 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573289 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:34.576478 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573291 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573294 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573296 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573299 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573302 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573305 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573307 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573310 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573313 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573315 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573318 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573320 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573323 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573325 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573328 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573357 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573360 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:34.576968 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573362 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.573368 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573464 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573468 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573471 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573474 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573477 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573480 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573483 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573486 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573489 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573492 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573495 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573498 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573502 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:34.577397 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573506 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573509 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573512 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573515 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573518 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573521 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573524 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573527 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573529 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573532 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573534 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573537 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573539 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573542 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573545 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573547 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573549 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573552 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573554 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573557 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:34.577766 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573559 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573562 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573564 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573567 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573569 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573571 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573574 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573577 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573579 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573582 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573585 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573588 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573590 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573593 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573596 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573598 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573601 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573603 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573606 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573609 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:34.578274 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573611 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573614 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573616 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573618 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573621 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573623 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573626 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573628 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573631 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573634 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573636 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573639 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573641 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573643 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573646 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573648 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573651 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573653 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573656 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573658 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:34.578764 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573661 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573664 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573666 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573670 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573672 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573675 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573677 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573681 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573685 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573687 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573690 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573693 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:34.573695 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.573700 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:34.579267 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.574492 2559 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:50:34.579605 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.576716 2559 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:50:34.579605 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.577804 2559 server.go:1019] "Starting client certificate rotation"
Apr 23 08:50:34.579605 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.578284 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:34.579605 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.578330 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:34.605452 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.605436 2559 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:34.610600 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.610574 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:34.624410 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.624377 2559 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:50:34.630212 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.630194 2559 log.go:25] "Validated CRI v1 image API"
Apr 23 08:50:34.632184 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.632162 2559 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:50:34.634832 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.634813 2559 fs.go:135] Filesystem UUIDs: map[32e17f45-c3e0-449c-be00-d7eb6b0f68ea:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 eb68d8c3-f453-439b-b98f-a183442498fe:/dev/nvme0n1p3]
Apr 23 08:50:34.634890 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.634835 2559 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:50:34.637628 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.637613 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:50:34.640581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.640473 2559 manager.go:217] Machine: {Timestamp:2026-04-23 08:50:34.638493442 +0000 UTC m=+0.451098821 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098339 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29afe56366fc3adb5c4a5aa8b09444 SystemUUID:ec29afe5-6366-fc3a-db5c-4a5aa8b09444 BootID:7ef830b2-67d8-4b87-82bd-b25e0c5af558 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a7:4b:56:8d:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a7:4b:56:8d:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:26:da:65:5d:5d:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:50:34.640581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.640576 2559 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:50:34.640689 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.640649 2559 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:50:34.642835 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.642810 2559 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:50:34.642979 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.642836 2559 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-31.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 08:50:34.643044 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.643005 2559 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 08:50:34.643044 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.643015 2559 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 08:50:34.643044 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.643029 2559 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:50:34.644012 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.644002 2559 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:50:34.645585 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.645575 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:50:34.645817 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.645807 2559 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 08:50:34.648430 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.648419 2559 kubelet.go:491] "Attempting to sync node with API server" Apr 23 08:50:34.648474 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.648441 2559 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 08:50:34.648474 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.648452 2559 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 08:50:34.648474 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.648461 2559 kubelet.go:397] "Adding apiserver pod source" Apr 23 08:50:34.648474 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.648469 2559 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 08:50:34.650624 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.650609 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:50:34.650696 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.650629 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:50:34.653690 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.653676 2559 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 08:50:34.655168 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.655149 2559 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:50:34.655292 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.655277 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2jrw" Apr 23 08:50:34.656965 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.656951 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.656972 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.656995 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657004 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657013 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657021 2559 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657029 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657038 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:50:34.657047 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657048 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:50:34.657298 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657056 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:50:34.657298 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657075 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:50:34.657298 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657089 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:50:34.657927 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657915 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:50:34.658022 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.657932 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:50:34.659249 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.659217 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:50:34.659348 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.659232 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-31.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:50:34.661434 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.661419 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:50:34.661517 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.661457 2559 server.go:1295] "Started kubelet" Apr 23 08:50:34.661565 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.661501 2559 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:50:34.661667 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.661612 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:50:34.661714 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.661695 2559 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:50:34.662091 ip-10-0-137-31 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:50:34.663038 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.662973 2559 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:50:34.666867 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.666844 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2jrw" Apr 23 08:50:34.667948 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.667930 2559 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:50:34.673025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.673005 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:50:34.673678 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.673662 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:50:34.674505 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.674487 2559 factory.go:221] Registration of the containerd container factory failed: unable to create 
containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 08:50:34.674623 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.674610 2559 factory.go:55] Registering systemd factory Apr 23 08:50:34.674695 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.674680 2559 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:50:34.674751 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.674521 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:34.675485 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.675464 2559 factory.go:153] Registering CRI-O factory Apr 23 08:50:34.675560 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.675488 2559 factory.go:223] Registration of the crio container factory successfully Apr 23 08:50:34.675560 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.675512 2559 factory.go:103] Registering Raw factory Apr 23 08:50:34.675560 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.675534 2559 manager.go:1196] Started watching for new ooms in manager Apr 23 08:50:34.676062 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676043 2559 manager.go:319] Starting recovery of all containers Apr 23 08:50:34.676176 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676157 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:50:34.676292 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.676158 2559 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:50:34.676292 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676158 2559 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:50:34.676429 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676299 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:50:34.676519 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676507 2559 reconstruct.go:97] "Volume reconstruction finished" Apr 23 08:50:34.676563 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676519 2559 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:50:34.676872 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.676855 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:34.679359 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.679337 2559 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-31.ec2.internal" not found Apr 23 08:50:34.680617 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.680573 2559 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-31.ec2.internal\" not found" node="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.685368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.685269 2559 manager.go:324] Recovery completed Apr 23 08:50:34.689728 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.689714 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.692581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.692567 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.692671 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.692596 2559 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.692671 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.692609 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.693095 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.693081 2559 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:50:34.693095 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.693094 2559 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:50:34.693211 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.693110 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:50:34.695808 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.695792 2559 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-31.ec2.internal" not found Apr 23 08:50:34.696754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.696741 2559 policy_none.go:49] "None policy: Start" Apr 23 08:50:34.696813 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.696757 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:50:34.696813 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.696766 2559 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.736891 2559 manager.go:341] "Starting Device Plugin manager" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.736922 2559 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.736931 2559 server.go:85] "Starting device plugin registration server" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.737115 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:50:34.753670 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:50:34.737126 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.737204 2559 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.737275 2559 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.737282 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.737610 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 08:50:34.753670 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.737650 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:34.757438 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.757418 2559 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-31.ec2.internal" not found Apr 23 08:50:34.809467 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.809411 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:50:34.810521 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.810502 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 08:50:34.810582 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.810526 2559 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:50:34.810582 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.810541 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 08:50:34.810582 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.810548 2559 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 08:50:34.810582 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.810578 2559 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 08:50:34.813079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.813060 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:34.837260 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.837244 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.838116 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.838102 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.838184 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.838127 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.838184 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.838139 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.838184 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.838158 2559 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.846700 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.846687 2559 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.846755 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.846704 2559 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-31.ec2.internal\": node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:34.856467 
ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.856449 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:34.911486 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.911467 2559 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal"] Apr 23 08:50:34.911542 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.911521 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.912294 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.912276 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.912339 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.912301 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.912339 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.912315 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.914566 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.914554 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.914715 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.914703 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.914751 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.914728 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.915202 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915183 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.915279 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915210 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.915279 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915187 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.915279 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915241 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.915279 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915250 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.915279 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.915223 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.917385 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.917371 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.917459 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.917394 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:34.917949 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.917929 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:34.917949 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.917950 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:34.918120 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.917961 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:34.941541 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.941521 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-31.ec2.internal\" not found" node="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.945866 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.945852 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-31.ec2.internal\" not found" node="ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.956645 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:34.956631 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:34.977511 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.977493 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.977558 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.977516 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c00815b7f1126dd8cdee9bcddd809207-config\") pod \"kube-apiserver-proxy-ip-10-0-137-31.ec2.internal\" (UID: \"c00815b7f1126dd8cdee9bcddd809207\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" Apr 23 08:50:34.977558 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:34.977534 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.056987 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.056965 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found" Apr 23 08:50:35.078409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078362 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.078409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078388 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.078409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078409 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c00815b7f1126dd8cdee9bcddd809207-config\") pod \"kube-apiserver-proxy-ip-10-0-137-31.ec2.internal\" (UID: \"c00815b7f1126dd8cdee9bcddd809207\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.078588 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078445 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c00815b7f1126dd8cdee9bcddd809207-config\") pod \"kube-apiserver-proxy-ip-10-0-137-31.ec2.internal\" (UID: \"c00815b7f1126dd8cdee9bcddd809207\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.078588 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078460 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" Apr 23 08:50:35.078588 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.078486 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c94877e605152d4e7bb221bdc00e5aec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal\" (UID: \"c94877e605152d4e7bb221bdc00e5aec\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal"
Apr 23 08:50:35.157701 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.157675    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.244284 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.244254    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal"
Apr 23 08:50:35.248919 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.248904    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal"
Apr 23 08:50:35.258759 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.258741    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.359352 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.359302    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.459911 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.459891    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.560496 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.560476    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.577864 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.577849    2559 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:50:35.577995 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.577959    2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:50:35.578054 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.578020    2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:50:35.661541 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:35.661513    2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-31.ec2.internal\" not found"
Apr 23 08:50:35.669683 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.669650    2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:45:34 +0000 UTC" deadline="2027-12-29 20:05:46.976526008 +0000 UTC"
Apr 23 08:50:35.669683 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.669678    2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14771h15m11.306850766s"
Apr 23 08:50:35.673814 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.673797    2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:50:35.690565 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.690549    2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:50:35.693075 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.693059    2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:35.710300 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.710283    2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-96rc7"
Apr 23 08:50:35.717462 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.717448    2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-96rc7"
Apr 23 08:50:35.751780 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:35.751753    2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00815b7f1126dd8cdee9bcddd809207.slice/crio-fdadfc363560a43e5126f27b39cdba4b3ad4fc8da9aca44f8cdc44b5cf6457b4 WatchSource:0}: Error finding container fdadfc363560a43e5126f27b39cdba4b3ad4fc8da9aca44f8cdc44b5cf6457b4: Status 404 returned error can't find the container with id fdadfc363560a43e5126f27b39cdba4b3ad4fc8da9aca44f8cdc44b5cf6457b4
Apr 23 08:50:35.752164 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:35.752133    2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94877e605152d4e7bb221bdc00e5aec.slice/crio-698ea379600fc1ac4be0c080b81ce9c3a7a2cd8c64ea0d54bffa351890a7026c WatchSource:0}: Error finding container 698ea379600fc1ac4be0c080b81ce9c3a7a2cd8c64ea0d54bffa351890a7026c: Status 404 returned error can't find the container with id 698ea379600fc1ac4be0c080b81ce9c3a7a2cd8c64ea0d54bffa351890a7026c
Apr 23 08:50:35.757580 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.757531    2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:50:35.774952 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.774936    2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal"
Apr 23 08:50:35.785060 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.785046    2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:50:35.786955 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.786944    2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal"
Apr 23 08:50:35.792387 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.792372    2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:50:35.813080 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.813040    2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" event={"ID":"c94877e605152d4e7bb221bdc00e5aec","Type":"ContainerStarted","Data":"698ea379600fc1ac4be0c080b81ce9c3a7a2cd8c64ea0d54bffa351890a7026c"}
Apr 23 08:50:35.813905 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:35.813886    2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" event={"ID":"c00815b7f1126dd8cdee9bcddd809207","Type":"ContainerStarted","Data":"fdadfc363560a43e5126f27b39cdba4b3ad4fc8da9aca44f8cdc44b5cf6457b4"}
Apr 23 08:50:36.056241 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.056193    2559 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:36.430559 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.430482    2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:36.649383 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.649357    2559 apiserver.go:52] "Watching apiserver"
Apr 23 08:50:36.654937 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.654915    2559 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 08:50:36.657401 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.657355    2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2zjq6","kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc","openshift-image-registry/node-ca-j7htq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal","openshift-multus/multus-pbxp4","openshift-network-operator/iptables-alerter-fx7dn","openshift-ovn-kubernetes/ovnkube-node-sqslz","openshift-cluster-node-tuning-operator/tuned-jjkvv","openshift-multus/multus-additional-cni-plugins-nxtdf","openshift-multus/network-metrics-daemon-lx4sg","openshift-network-diagnostics/network-check-target-7z6cq"]
Apr 23 08:50:36.662684 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.662657    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:36.664954 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.664552    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 08:50:36.664954 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.664580    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 08:50:36.664954 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.664580    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qvdn9\""
Apr 23 08:50:36.664954 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.664774    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.666469 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.666447    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.666469 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.666467    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4qxb\""
Apr 23 08:50:36.666663 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.666640    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.666804 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.666714    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 08:50:36.666943 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.666925    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j7htq"
Apr 23 08:50:36.668774 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.668754    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:50:36.669046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.668964    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2snlq\""
Apr 23 08:50:36.669137 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.669122    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.669195 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.669179    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.669684 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.669623    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.671899 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.671254    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 08:50:36.671899 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.671288    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.671899 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.671307    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.671899 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.671254    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 08:50:36.671899 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.671553    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ggpnv\""
Apr 23 08:50:36.672908 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.672890    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:36.673114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.673093    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.674670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.674649    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.674763 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.674746    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.674763 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.674756    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mphhw\""
Apr 23 08:50:36.675580 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675471    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.675580 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675490    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 08:50:36.675580 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675581    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.675778 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675620    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 08:50:36.675778 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675709    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 08:50:36.675778 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675729    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 08:50:36.675921 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675797    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 08:50:36.675921 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.675913    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d7sbx\""
Apr 23 08:50:36.679703 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.679492    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.679703 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.679622    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.681266 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681208    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:36.681266 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681248    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 08:50:36.681396 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681293    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fj2c\""
Apr 23 08:50:36.681663 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681641    2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lghp9\""
Apr 23 08:50:36.681751 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681690    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 08:50:36.681751 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.681711    2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:36.682349 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.682327    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:36.682456 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.682425    2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:50:36.685718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.685702    2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:36.685816 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.685762    2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:50:36.688225 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688204    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-env-overrides\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.688302 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688241    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-sys\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688302 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688264    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-tmp\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688302 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688286    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.688400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688308    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysconfig\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688330    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.688400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688353    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-os-release\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.688400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688378    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-kubelet\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688408    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688435    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-var-lib-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688471    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-script-lib\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688494    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-host-slash\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688517    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-system-cni-dir\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.688537 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688531    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-multus-certs\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688545    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-lib-modules\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688558    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-tuned\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688578    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-iptables-alerter-script\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688611    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688635    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-os-release\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688655    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-multus\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688694    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-netns\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688736    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688763    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-conf\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.688792 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688791    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-host\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688854    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-konnectivity-ca\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688888    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-systemd\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688913    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-etc-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.688936    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-log-socket\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689001    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wbz\" (UniqueName: \"kubernetes.io/projected/3a8842ac-bb6d-4882-b788-6c3e16e84191-kube-api-access-74wbz\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689037    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-registration-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689075    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689113    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-k8s-cni-cncf-io\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689143    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-netns\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689165    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-node-log\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689188    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-kubernetes\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689209    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-systemd\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689255 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689247    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-run\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689282    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689308    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-socket-dir-parent\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689339    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689362    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-modprobe-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689393    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-ovn\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689414    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lwc\" (UniqueName: \"kubernetes.io/projected/739856dc-433b-4287-917c-c4009595df8c-kube-api-access-j9lwc\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689451    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6p9\" (UniqueName: \"kubernetes.io/projected/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-kube-api-access-9n6p9\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689484    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlh8\" (UniqueName: \"kubernetes.io/projected/757f8beb-3271-44ad-88be-22369a09a56a-kube-api-access-9dlh8\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689525    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-etc-kubernetes\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689569    2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-device-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689602 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23173d3-6e3a-4b55-b1dc-7075f2278e15-serviceca\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689627 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49ggm\" (UniqueName: \"kubernetes.io/projected/d23173d3-6e3a-4b55-b1dc-7075f2278e15-kube-api-access-49ggm\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689648 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-cnibin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689682 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689709 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-config\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.689747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689752 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-var-lib-kubelet\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689793 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-hostroot\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689819 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-kubelet\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689853 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-systemd-units\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 
08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689877 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76d63805-1298-4d8e-8c56-96c2091a8697-ovn-node-metrics-cert\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689906 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxw5\" (UniqueName: \"kubernetes.io/projected/76d63805-1298-4d8e-8c56-96c2091a8697-kube-api-access-4gxw5\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689949 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.689998 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-cni-binary-copy\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690024 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-netd\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690046 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-socket-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690067 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppls\" (UniqueName: \"kubernetes.io/projected/be6f3563-0a3f-4959-9cda-d87e7a467749-kube-api-access-4ppls\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690089 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-sys-fs\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690112 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23173d3-6e3a-4b55-b1dc-7075f2278e15-host\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690134 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-cnibin\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690172 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-system-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690250 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-bin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.690395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690274 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-daemon-config\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.691143 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690326 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-bin\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.691143 
ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690410 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.691143 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690438 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-agent-certs\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6" Apr 23 08:50:36.691143 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690462 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-conf-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.691143 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.690487 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-slash\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.718071 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.718006 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:35 +0000 UTC" deadline="2027-12-10 18:09:55.631689686 +0000 UTC" Apr 23 08:50:36.718071 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:50:36.718034 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14313h19m18.913658712s" Apr 23 08:50:36.777237 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.777218 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:50:36.791235 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791213 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-registration-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.791320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791247 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.791320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791274 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-k8s-cni-cncf-io\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791290 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-netns\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " 
pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791304 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-node-log\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.791320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791310 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-registration-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791352 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-k8s-cni-cncf-io\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791319 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-kubernetes\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791372 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791387 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-systemd\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791388 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-netns\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791378 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-kubernetes\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791406 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-run\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791392 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-node-log\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 
08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791430 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791441 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-systemd\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791467 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-socket-dir-parent\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791486 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791466 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-run\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791499 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-socket-dir-parent\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791537 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-modprobe-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791566 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-ovn\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.791570 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791574 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791591 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lwc\" (UniqueName: \"kubernetes.io/projected/739856dc-433b-4287-917c-c4009595df8c-kube-api-access-j9lwc\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: 
\"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791619 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6p9\" (UniqueName: \"kubernetes.io/projected/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-kube-api-access-9n6p9\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791645 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlh8\" (UniqueName: \"kubernetes.io/projected/757f8beb-3271-44ad-88be-22369a09a56a-kube-api-access-9dlh8\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791649 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-ovn\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791683 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-modprobe-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791752 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-etc-kubernetes\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791784 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-device-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791809 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23173d3-6e3a-4b55-b1dc-7075f2278e15-serviceca\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791853 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-etc-kubernetes\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791888 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-device-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791900 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791902 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49ggm\" (UniqueName: \"kubernetes.io/projected/d23173d3-6e3a-4b55-b1dc-7075f2278e15-kube-api-access-49ggm\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791960 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-cnibin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.791998 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792024 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-config\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792049 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-var-lib-kubelet\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.792256 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792050 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-cnibin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792050 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792099 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-hostroot\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792125 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-var-lib-kubelet\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792130 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-kubelet\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792171 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-kubelet\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792178 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-systemd-units\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792207 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76d63805-1298-4d8e-8c56-96c2091a8697-ovn-node-metrics-cert\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792212 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23173d3-6e3a-4b55-b1dc-7075f2278e15-serviceca\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792179 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-hostroot\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792219 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-systemd-units\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792232 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxw5\" (UniqueName: \"kubernetes.io/projected/76d63805-1298-4d8e-8c56-96c2091a8697-kube-api-access-4gxw5\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792282 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhzq\" (UniqueName: \"kubernetes.io/projected/61450b58-933b-4b5d-bf40-9e4408670e3e-kube-api-access-cxhzq\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792314 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792338 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-cni-binary-copy\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792362 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-netd\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792386 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-socket-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.793090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792432 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppls\" (UniqueName: \"kubernetes.io/projected/be6f3563-0a3f-4959-9cda-d87e7a467749-kube-api-access-4ppls\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792457 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-sys-fs\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792486 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792510 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23173d3-6e3a-4b55-b1dc-7075f2278e15-host\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792536 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-cnibin\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792560 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-system-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792545 2559 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792589 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-bin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792605 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-config\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792612 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-daemon-config\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792638 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-bin\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792659 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792675 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-agent-certs\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792718 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-cnibin\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792718 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23173d3-6e3a-4b55-b1dc-7075f2278e15-host\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792757 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-system-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792754 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792794 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-bin\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.793857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792879 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-cni-binary-copy\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.792943 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-bin\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793099 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793118 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-cni-netd\") pod 
\"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793159 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-conf-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793176 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-sys-fs\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793185 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-slash\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793210 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-env-overrides\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-conf-dir\") pod \"multus-pbxp4\" 
(UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793232 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-sys\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793239 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-socket-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793251 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-slash\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793255 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-tmp\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793288 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " 
pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793293 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-sys\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793249 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-daemon-config\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793314 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysconfig\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793338 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.794675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793352 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-multus-cni-dir\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 
08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793363 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-os-release\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793392 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysconfig\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793392 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-kubelet\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793424 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-kubelet\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793426 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 
08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793458 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-var-lib-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793486 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-script-lib\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793505 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739856dc-433b-4287-917c-c4009595df8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793513 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-d\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793517 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: 
\"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793561 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-host-slash\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793565 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-var-lib-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793585 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-env-overrides\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793610 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-system-cni-dir\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793629 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-os-release\") pod 
\"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793636 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-host-slash\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn" Apr 23 08:50:36.795688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793691 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-multus-certs\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793691 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757f8beb-3271-44ad-88be-22369a09a56a-system-cni-dir\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793716 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-lib-modules\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793728 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-run-multus-certs\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793738 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-tuned\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793775 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-iptables-alerter-script\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793800 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.793813 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-lib-modules\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794237 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-os-release\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794270 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-multus\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794298 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-netns\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794323 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794353 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-conf\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:50:36.794376 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-host\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794396 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-iptables-alerter-script\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794400 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-konnectivity-ca\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794425 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-systemd\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.796395 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794448 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-etc-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" Apr 23 08:50:36.797114 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:50:36.794457 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794471 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-log-socket\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794496 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74wbz\" (UniqueName: \"kubernetes.io/projected/3a8842ac-bb6d-4882-b788-6c3e16e84191-kube-api-access-74wbz\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794515 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-os-release\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794554 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be6f3563-0a3f-4959-9cda-d87e7a467749-host-var-lib-cni-multus\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794580 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-host-run-netns\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794648 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-etc-openvswitch\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794689 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-run-systemd\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794749 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76d63805-1298-4d8e-8c56-96c2091a8697-log-socket\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794795 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-sysctl-conf\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794805 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a8842ac-bb6d-4882-b788-6c3e16e84191-host\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.794829 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/757f8beb-3271-44ad-88be-22369a09a56a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.795070 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76d63805-1298-4d8e-8c56-96c2091a8697-ovnkube-script-lib\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.795085 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-konnectivity-ca\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.796239 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-tmp\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.796575 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3a8842ac-bb6d-4882-b788-6c3e16e84191-etc-tuned\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.796905 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76d63805-1298-4d8e-8c56-96c2091a8697-ovn-node-metrics-cert\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.797114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.797025 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d2f6fea9-05ba-40e3-bbdf-c915828b21ac-agent-certs\") pod \"konnectivity-agent-2zjq6\" (UID: \"d2f6fea9-05ba-40e3-bbdf-c915828b21ac\") " pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:36.801407 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.801329 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6p9\" (UniqueName: \"kubernetes.io/projected/de15a57c-d8b0-4a96-8a86-1a6ba933e6d9-kube-api-access-9n6p9\") pod \"iptables-alerter-fx7dn\" (UID: \"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9\") " pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:36.801619 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.801522 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lwc\" (UniqueName: \"kubernetes.io/projected/739856dc-433b-4287-917c-c4009595df8c-kube-api-access-j9lwc\") pod \"aws-ebs-csi-driver-node-2xcwc\" (UID: \"739856dc-433b-4287-917c-c4009595df8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:36.801889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.801850 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppls\" (UniqueName: \"kubernetes.io/projected/be6f3563-0a3f-4959-9cda-d87e7a467749-kube-api-access-4ppls\") pod \"multus-pbxp4\" (UID: \"be6f3563-0a3f-4959-9cda-d87e7a467749\") " pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:36.802011 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.801891 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49ggm\" (UniqueName: \"kubernetes.io/projected/d23173d3-6e3a-4b55-b1dc-7075f2278e15-kube-api-access-49ggm\") pod \"node-ca-j7htq\" (UID: \"d23173d3-6e3a-4b55-b1dc-7075f2278e15\") " pod="openshift-image-registry/node-ca-j7htq"
Apr 23 08:50:36.802011 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.801947 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlh8\" (UniqueName: \"kubernetes.io/projected/757f8beb-3271-44ad-88be-22369a09a56a-kube-api-access-9dlh8\") pod \"multus-additional-cni-plugins-nxtdf\" (UID: \"757f8beb-3271-44ad-88be-22369a09a56a\") " pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:36.803934 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.803898 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wbz\" (UniqueName: \"kubernetes.io/projected/3a8842ac-bb6d-4882-b788-6c3e16e84191-kube-api-access-74wbz\") pod \"tuned-jjkvv\" (UID: \"3a8842ac-bb6d-4882-b788-6c3e16e84191\") " pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:36.803934 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.803907 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxw5\" (UniqueName: \"kubernetes.io/projected/76d63805-1298-4d8e-8c56-96c2091a8697-kube-api-access-4gxw5\") pod \"ovnkube-node-sqslz\" (UID: \"76d63805-1298-4d8e-8c56-96c2091a8697\") " pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:36.895072 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.895043 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:36.895249 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.895118 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhzq\" (UniqueName: \"kubernetes.io/projected/61450b58-933b-4b5d-bf40-9e4408670e3e-kube-api-access-cxhzq\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:36.895249 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.895141 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:36.895249 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.895242 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:36.895389 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.895332 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:50:37.39528643 +0000 UTC m=+3.207891795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:36.901730 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.901706 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:36.901730 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.901726 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:36.901730 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.901736 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:36.901954 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:36.901819 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:37.401807002 +0000 UTC m=+3.214412368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:36.907057 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.907036 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhzq\" (UniqueName: \"kubernetes.io/projected/61450b58-933b-4b5d-bf40-9e4408670e3e-kube-api-access-cxhzq\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:36.974746 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.974679 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:36.985463 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.985444 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j7htq"
Apr 23 08:50:36.993052 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:36.993030 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc"
Apr 23 08:50:37.000722 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.000701 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pbxp4"
Apr 23 08:50:37.009293 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.009275 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fx7dn"
Apr 23 08:50:37.015865 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.015847 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:37.021481 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.021464 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jjkvv"
Apr 23 08:50:37.025995 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.025959 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nxtdf"
Apr 23 08:50:37.062449 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.062430 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:37.397782 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.397760 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:37.397913 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.397894 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:37.398001 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.397965 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:50:38.397946187 +0000 UTC m=+4.210551558 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:37.401961 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.401933 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a8842ac_bb6d_4882_b788_6c3e16e84191.slice/crio-48b21eaaac935e4899ef356699f1ab48efc27e73635cec433d179fc4ed4716d3 WatchSource:0}: Error finding container 48b21eaaac935e4899ef356699f1ab48efc27e73635cec433d179fc4ed4716d3: Status 404 returned error can't find the container with id 48b21eaaac935e4899ef356699f1ab48efc27e73635cec433d179fc4ed4716d3
Apr 23 08:50:37.404545 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.404309 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe6f3563_0a3f_4959_9cda_d87e7a467749.slice/crio-6ac0e59c3cfafc3477fc3a8bebe1804d8b6529b8fc88e035189ae49fde03b9b1 WatchSource:0}: Error finding container 6ac0e59c3cfafc3477fc3a8bebe1804d8b6529b8fc88e035189ae49fde03b9b1: Status 404 returned error can't find the container with id 6ac0e59c3cfafc3477fc3a8bebe1804d8b6529b8fc88e035189ae49fde03b9b1
Apr 23 08:50:37.406351 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.406083 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde15a57c_d8b0_4a96_8a86_1a6ba933e6d9.slice/crio-0cd86dac6e7374a6e061ae8b997242ff6ef57b3debfc00a6dab20e905657cc17 WatchSource:0}: Error finding container 0cd86dac6e7374a6e061ae8b997242ff6ef57b3debfc00a6dab20e905657cc17: Status 404 returned error can't find the container with id 0cd86dac6e7374a6e061ae8b997242ff6ef57b3debfc00a6dab20e905657cc17
Apr 23 08:50:37.407793 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.407769 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd23173d3_6e3a_4b55_b1dc_7075f2278e15.slice/crio-2090ab1cf04d34d22b8cab4c9d30bfeeb5e9fec5780a110dc43a0ce52a474d79 WatchSource:0}: Error finding container 2090ab1cf04d34d22b8cab4c9d30bfeeb5e9fec5780a110dc43a0ce52a474d79: Status 404 returned error can't find the container with id 2090ab1cf04d34d22b8cab4c9d30bfeeb5e9fec5780a110dc43a0ce52a474d79
Apr 23 08:50:37.409155 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.409038 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod739856dc_433b_4287_917c_c4009595df8c.slice/crio-c3080f9e7f1a20a28b2a4d39c47ac4a8693ce5e9eab6c589ff16c58756d3c8c6 WatchSource:0}: Error finding container c3080f9e7f1a20a28b2a4d39c47ac4a8693ce5e9eab6c589ff16c58756d3c8c6: Status 404 returned error can't find the container with id c3080f9e7f1a20a28b2a4d39c47ac4a8693ce5e9eab6c589ff16c58756d3c8c6
Apr 23 08:50:37.409528 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.409489 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757f8beb_3271_44ad_88be_22369a09a56a.slice/crio-754d495e6c76a041c6e4130d3f9a0abedbcf7305160a61342bce230b9e89927f WatchSource:0}: Error finding container 754d495e6c76a041c6e4130d3f9a0abedbcf7305160a61342bce230b9e89927f: Status 404 returned error can't find the container with id 754d495e6c76a041c6e4130d3f9a0abedbcf7305160a61342bce230b9e89927f
Apr 23 08:50:37.410461 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.410348 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d63805_1298_4d8e_8c56_96c2091a8697.slice/crio-e0e68d7e747bb554246ad1205045294d481169b316a8cbb4bf3dbbec4c738ea1 WatchSource:0}: Error finding container e0e68d7e747bb554246ad1205045294d481169b316a8cbb4bf3dbbec4c738ea1: Status 404 returned error can't find the container with id e0e68d7e747bb554246ad1205045294d481169b316a8cbb4bf3dbbec4c738ea1
Apr 23 08:50:37.411850 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:37.411819 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f6fea9_05ba_40e3_bbdf_c915828b21ac.slice/crio-38713a29ba6643721cb737125324413e30bbd1302adf9c3a09d20039c6ed5c79 WatchSource:0}: Error finding container 38713a29ba6643721cb737125324413e30bbd1302adf9c3a09d20039c6ed5c79: Status 404 returned error can't find the container with id 38713a29ba6643721cb737125324413e30bbd1302adf9c3a09d20039c6ed5c79
Apr 23 08:50:37.498182 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.498056 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:37.498256 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.498212 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:37.498256 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.498229 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:37.498256 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.498237 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:37.498363 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:37.498282 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:38.498266593 +0000 UTC m=+4.310871975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:37.719079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.718970 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:35 +0000 UTC" deadline="2027-10-04 09:28:14.831147911 +0000 UTC"
Apr 23 08:50:37.719079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.719019 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12696h37m37.112132799s"
Apr 23 08:50:37.819075 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.818922 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerStarted","Data":"754d495e6c76a041c6e4130d3f9a0abedbcf7305160a61342bce230b9e89927f"}
Apr 23 08:50:37.823383 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.823352 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fx7dn" event={"ID":"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9","Type":"ContainerStarted","Data":"0cd86dac6e7374a6e061ae8b997242ff6ef57b3debfc00a6dab20e905657cc17"}
Apr 23 08:50:37.832848 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.832005 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" event={"ID":"c00815b7f1126dd8cdee9bcddd809207","Type":"ContainerStarted","Data":"18e98bd06fbd1247780c7e38fb919d353bc4b80061ad6a432a0f1f0b00827aa3"}
Apr 23 08:50:37.838842 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.838605 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2zjq6" event={"ID":"d2f6fea9-05ba-40e3-bbdf-c915828b21ac","Type":"ContainerStarted","Data":"38713a29ba6643721cb737125324413e30bbd1302adf9c3a09d20039c6ed5c79"}
Apr 23 08:50:37.847459 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.846259 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"e0e68d7e747bb554246ad1205045294d481169b316a8cbb4bf3dbbec4c738ea1"}
Apr 23 08:50:37.847459 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.846770 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-31.ec2.internal" podStartSLOduration=2.846758231 podStartE2EDuration="2.846758231s" podCreationTimestamp="2026-04-23 08:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:37.846381078 +0000 UTC m=+3.658986467" watchObservedRunningTime="2026-04-23 08:50:37.846758231 +0000 UTC m=+3.659363622"
Apr 23 08:50:37.856048 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.856000 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" event={"ID":"739856dc-433b-4287-917c-c4009595df8c","Type":"ContainerStarted","Data":"c3080f9e7f1a20a28b2a4d39c47ac4a8693ce5e9eab6c589ff16c58756d3c8c6"}
Apr 23 08:50:37.857769 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.857711 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j7htq" event={"ID":"d23173d3-6e3a-4b55-b1dc-7075f2278e15","Type":"ContainerStarted","Data":"2090ab1cf04d34d22b8cab4c9d30bfeeb5e9fec5780a110dc43a0ce52a474d79"}
Apr 23 08:50:37.860277 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.860246 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pbxp4" event={"ID":"be6f3563-0a3f-4959-9cda-d87e7a467749","Type":"ContainerStarted","Data":"6ac0e59c3cfafc3477fc3a8bebe1804d8b6529b8fc88e035189ae49fde03b9b1"}
Apr 23 08:50:37.864450 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:37.864410 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" event={"ID":"3a8842ac-bb6d-4882-b788-6c3e16e84191","Type":"ContainerStarted","Data":"48b21eaaac935e4899ef356699f1ab48efc27e73635cec433d179fc4ed4716d3"}
Apr 23 08:50:38.407741 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.407714 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:38.407932 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.407914 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:38.408023 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.408001 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:50:40.407966351 +0000 UTC m=+6.220571729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:38.508137 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.508102 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:38.508312 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.508275 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:38.508312 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.508298 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:38.508312 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.508313 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:38.508467 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.508368 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:40.508348938 +0000 UTC m=+6.320954323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:38.811980 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.811906 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:38.812418 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.812059 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:50:38.812508 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.812489 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:38.812601 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:38.812582 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:50:38.875738 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.875706 2559 generic.go:358] "Generic (PLEG): container finished" podID="c94877e605152d4e7bb221bdc00e5aec" containerID="64eed04056334f52a2baf6b4a8713dc2527bdcde20c99bedad8abcc5a11a0f78" exitCode=0
Apr 23 08:50:38.876603 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:38.876575 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" event={"ID":"c94877e605152d4e7bb221bdc00e5aec","Type":"ContainerDied","Data":"64eed04056334f52a2baf6b4a8713dc2527bdcde20c99bedad8abcc5a11a0f78"}
Apr 23 08:50:39.880572 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:39.880535 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" event={"ID":"c94877e605152d4e7bb221bdc00e5aec","Type":"ContainerStarted","Data":"56ce224f77119ad56eab76c755ed574a57ed664a3269e3e2bd078faca878a8e8"}
Apr 23 08:50:40.421242 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:40.421059 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:40.421242 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.421200 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:40.421465 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.421266 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:50:44.42124828 +0000 UTC m=+10.233853650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:40.522234 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:40.522200 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:40.522405 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.522348 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:40.522405 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.522367 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:40.522405 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.522379 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:40.522557 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.522436 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:44.522416872 +0000 UTC m=+10.335022253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:40.812460 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:40.811516 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:40.812460 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.811646 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:50:40.812460 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:40.812069 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:40.812460 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:40.812180 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:42.811369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:42.811258 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:42.811369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:42.811295 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:42.811866 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:42.811405 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:42.811866 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:42.811827 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:44.450538 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:44.450483 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:44.450976 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.450659 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:44.450976 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.450737 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:50:52.45071656 +0000 UTC m=+18.263321928 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:44.551793 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:44.551752 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:44.551975 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.551935 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:44.551975 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.551956 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:44.551975 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.551969 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:44.552154 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.552039 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:50:52.55202363 +0000 UTC m=+18.364628999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:44.812724 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:44.812336 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:44.812724 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.812451 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:44.812724 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:44.812562 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:44.812724 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:44.812638 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:46.562462 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.562414 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-31.ec2.internal" podStartSLOduration=11.562400462 podStartE2EDuration="11.562400462s" podCreationTimestamp="2026-04-23 08:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:39.893299282 +0000 UTC m=+5.705904671" watchObservedRunningTime="2026-04-23 08:50:46.562400462 +0000 UTC m=+12.375005850" Apr 23 08:50:46.563134 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.563116 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dnx2l"] Apr 23 08:50:46.596923 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.596901 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.598774 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.598719 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:50:46.598884 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.598809 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:50:46.598961 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.598889 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fjfk4\"" Apr 23 08:50:46.669700 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.669675 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zlnf\" (UniqueName: \"kubernetes.io/projected/c5cbe02b-2850-4700-8339-43f2fe5f24d5-kube-api-access-9zlnf\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.669798 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.669710 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5cbe02b-2850-4700-8339-43f2fe5f24d5-tmp-dir\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.669798 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.669740 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5cbe02b-2850-4700-8339-43f2fe5f24d5-hosts-file\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.770669 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.770642 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zlnf\" (UniqueName: \"kubernetes.io/projected/c5cbe02b-2850-4700-8339-43f2fe5f24d5-kube-api-access-9zlnf\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.770776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.770676 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5cbe02b-2850-4700-8339-43f2fe5f24d5-tmp-dir\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.770776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.770707 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5cbe02b-2850-4700-8339-43f2fe5f24d5-hosts-file\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.770881 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.770799 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5cbe02b-2850-4700-8339-43f2fe5f24d5-hosts-file\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.771028 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.771011 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5cbe02b-2850-4700-8339-43f2fe5f24d5-tmp-dir\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.780978 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.780954 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9zlnf\" (UniqueName: \"kubernetes.io/projected/c5cbe02b-2850-4700-8339-43f2fe5f24d5-kube-api-access-9zlnf\") pod \"node-resolver-dnx2l\" (UID: \"c5cbe02b-2850-4700-8339-43f2fe5f24d5\") " pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:46.810721 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.810701 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:46.810831 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.810741 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:46.810831 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:46.810819 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:46.810937 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:46.810904 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:46.907161 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:46.907106 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dnx2l" Apr 23 08:50:48.811114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:48.811083 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:48.811554 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:48.811230 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:48.811554 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:48.811283 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:48.811554 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:48.811366 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:50.811234 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:50.811202 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:50.811674 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:50.811213 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:50.811674 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:50.811325 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:50.811674 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:50.811414 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:52.516670 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:52.516633 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:52.517210 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.516800 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:52.517210 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.516882 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. 
No retries permitted until 2026-04-23 08:51:08.516859469 +0000 UTC m=+34.329464848 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:52.617357 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:52.617326 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:52.617524 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.617469 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:52.617524 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.617496 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:52.617524 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.617510 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:52.617635 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.617569 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd 
podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:08.61755296 +0000 UTC m=+34.430158328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:52.811477 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:52.811391 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:52.811477 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:52.811392 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:52.811677 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.811542 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:52.811731 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:52.811690 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:53.788589 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:50:53.788563 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5cbe02b_2850_4700_8339_43f2fe5f24d5.slice/crio-bd5bc95d994fa3ad657ea5b14926c3c68fd3915f4ae34f45ce6e6e6c44dbeaf1 WatchSource:0}: Error finding container bd5bc95d994fa3ad657ea5b14926c3c68fd3915f4ae34f45ce6e6e6c44dbeaf1: Status 404 returned error can't find the container with id bd5bc95d994fa3ad657ea5b14926c3c68fd3915f4ae34f45ce6e6e6c44dbeaf1 Apr 23 08:50:53.906214 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:53.906013 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" event={"ID":"3a8842ac-bb6d-4882-b788-6c3e16e84191","Type":"ContainerStarted","Data":"cbebd9a82960d901fcb3dbcf76db301bb819e273373107f34f99c14b3579f39c"} Apr 23 08:50:53.907454 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:53.907427 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dnx2l" event={"ID":"c5cbe02b-2850-4700-8339-43f2fe5f24d5","Type":"ContainerStarted","Data":"bd5bc95d994fa3ad657ea5b14926c3c68fd3915f4ae34f45ce6e6e6c44dbeaf1"} Apr 23 08:50:54.812546 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.812331 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:50:54.813355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.812410 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:50:54.813355 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:54.812631 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5" Apr 23 08:50:54.813355 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:54.812730 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:50:54.910949 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.910919 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dnx2l" event={"ID":"c5cbe02b-2850-4700-8339-43f2fe5f24d5","Type":"ContainerStarted","Data":"51f6dac2f4d99a39b2203be504658a33bfc2536187ad672439b78b2889bf98ff"} Apr 23 08:50:54.912556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.912529 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="d6672c54be5c2631ee14b7d5dc58570636a6eb2c5b920828dba069753e364a28" exitCode=0 Apr 23 08:50:54.912703 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.912683 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"d6672c54be5c2631ee14b7d5dc58570636a6eb2c5b920828dba069753e364a28"} Apr 23 
Apr 23 08:50:54.914778 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.914528 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2zjq6" event={"ID":"d2f6fea9-05ba-40e3-bbdf-c915828b21ac","Type":"ContainerStarted","Data":"6ec84faeba6747766e4d5def5081406b1cc8ab0a162a5013190710cafc0d237d"}
Apr 23 08:50:54.917556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917533 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"a3f483c01261cf551e285540caf75f3dde37430b4e026a82fc512155fcc69ce5"}
Apr 23 08:50:54.917639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917563 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"29a58b9c3432e795a9d341e28d07fcea4d8812062bd43fc0d24ea20e925891db"}
Apr 23 08:50:54.917639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917573 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"84728c112b304846021767446fdef23e7f3d03a2077af36da69dc3f403aa8514"}
Apr 23 08:50:54.917639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917581 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"3cb3c1ac09f8eb84807d2a20b9c1f21db69db19cc4f6176a0b76d57d47eb0e84"}
Apr 23 08:50:54.917639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917590 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"56a60420971bd917c665bff402662eae0cfb1ac70f0eaa7793fcfe55dc7ace26"}
Apr 23 08:50:54.917639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.917599 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"03da01150b46282977db2b6f7040117716f16991322964e2c9677903362ecab3"}
Apr 23 08:50:54.918687 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.918655 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" event={"ID":"739856dc-433b-4287-917c-c4009595df8c","Type":"ContainerStarted","Data":"c2fa400883191fdf0d85b7f9778d0499d77e085d5f6702a654458a07e05e0be3"}
Apr 23 08:50:54.920042 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.920020 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j7htq" event={"ID":"d23173d3-6e3a-4b55-b1dc-7075f2278e15","Type":"ContainerStarted","Data":"7b8ffe8950596ac8a53bd757336790cdaa63ad1c4a1d0835c20a1504e6581c1c"}
Apr 23 08:50:54.921394 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.921367 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pbxp4" event={"ID":"be6f3563-0a3f-4959-9cda-d87e7a467749","Type":"ContainerStarted","Data":"eae6190fcc51d9e6987e5e25444f76268ff1af5d51da03d8c2b69e43f708401d"}
Apr 23 08:50:54.925424 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.925379 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dnx2l" podStartSLOduration=8.925365711 podStartE2EDuration="8.925365711s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:54.92473637 +0000 UTC m=+20.737341760" watchObservedRunningTime="2026-04-23 08:50:54.925365711 +0000 UTC m=+20.737971100"
Apr 23 08:50:54.938359 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.938319 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2zjq6" podStartSLOduration=4.575287384 podStartE2EDuration="20.938309484s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.413814112 +0000 UTC m=+3.226419492" lastFinishedPulling="2026-04-23 08:50:53.77683622 +0000 UTC m=+19.589441592" observedRunningTime="2026-04-23 08:50:54.937949028 +0000 UTC m=+20.750554413" watchObservedRunningTime="2026-04-23 08:50:54.938309484 +0000 UTC m=+20.750914871"
Apr 23 08:50:54.972843 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.972798 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j7htq" podStartSLOduration=4.605487616 podStartE2EDuration="20.972785507s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.409575687 +0000 UTC m=+3.222181065" lastFinishedPulling="2026-04-23 08:50:53.776873576 +0000 UTC m=+19.589478956" observedRunningTime="2026-04-23 08:50:54.972673296 +0000 UTC m=+20.785278684" watchObservedRunningTime="2026-04-23 08:50:54.972785507 +0000 UTC m=+20.785390901"
Apr 23 08:50:54.987737 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.987695 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pbxp4" podStartSLOduration=4.576601282 podStartE2EDuration="20.987682255s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.407042116 +0000 UTC m=+3.219647496" lastFinishedPulling="2026-04-23 08:50:53.818123089 +0000 UTC m=+19.630728469" observedRunningTime="2026-04-23 08:50:54.987151598 +0000 UTC m=+20.799756998" watchObservedRunningTime="2026-04-23 08:50:54.987682255 +0000 UTC m=+20.800287645"
Apr 23 08:50:54.997111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:54.997076 2559 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 08:50:55.002135 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.002102 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jjkvv" podStartSLOduration=4.630121345 podStartE2EDuration="21.00209334s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.404847493 +0000 UTC m=+3.217452861" lastFinishedPulling="2026-04-23 08:50:53.776819484 +0000 UTC m=+19.589424856" observedRunningTime="2026-04-23 08:50:55.001735706 +0000 UTC m=+20.814341116" watchObservedRunningTime="2026-04-23 08:50:55.00209334 +0000 UTC m=+20.814698727"
Apr 23 08:50:55.718978 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.718770 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:55.719530 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.719512 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:55.749646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.749620 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8jbv7"]
Apr 23 08:50:55.753008 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.752915 2559 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:50:54.997091109Z","UUID":"fa8b9d23-78f3-4b64-acf9-2ed53835828a","Handler":null,"Name":"","Endpoint":""}
Apr 23 08:50:55.753169 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.752998 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.753169 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:55.753125 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:50:55.754793 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.754774 2559 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 08:50:55.754882 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.754802 2559 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 08:50:55.838061 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.838032 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-dbus\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.838428 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.838085 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-kubelet-config\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.838428 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.838142 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.925102 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.925072 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" event={"ID":"739856dc-433b-4287-917c-c4009595df8c","Type":"ContainerStarted","Data":"3055af3d8ec9f70c6ca0612453f3329871c5218d333db6dde008c544dd7e1805"}
Apr 23 08:50:55.925219 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.925112 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" event={"ID":"739856dc-433b-4287-917c-c4009595df8c","Type":"ContainerStarted","Data":"d2bd78fe9f9eedae34127be16ab14b4006e7e9762a9cb6cc6804a66d442aa2e5"}
Apr 23 08:50:55.926457 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.926432 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fx7dn" event={"ID":"de15a57c-d8b0-4a96-8a86-1a6ba933e6d9","Type":"ContainerStarted","Data":"1458825109b49e119a789c2af04669972d83f9dba88bb914af2ebdcb751338b2"}
Apr 23 08:50:55.927297 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.927272 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:55.927728 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.927707 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2zjq6"
Apr 23 08:50:55.938625 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.938601 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-dbus\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.938754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.938647 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-kubelet-config\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.938754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.938689 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.938876 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.938756 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-kubelet-config\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.938876 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:55.938789 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:55.938876 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:55.938841 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret podName:4bd15f5a-3c09-4ed5-b70b-145f046716e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:56.438824989 +0000 UTC m=+22.251430357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret") pod "global-pull-secret-syncer-8jbv7" (UID: "4bd15f5a-3c09-4ed5-b70b-145f046716e5") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:55.938876 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.938890 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bd15f5a-3c09-4ed5-b70b-145f046716e5-dbus\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:55.940594 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.940550 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xcwc" podStartSLOduration=3.56244738 podStartE2EDuration="21.940539033s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.41084929 +0000 UTC m=+3.223454657" lastFinishedPulling="2026-04-23 08:50:55.788940943 +0000 UTC m=+21.601546310" observedRunningTime="2026-04-23 08:50:55.940168962 +0000 UTC m=+21.752774346" watchObservedRunningTime="2026-04-23 08:50:55.940539033 +0000 UTC m=+21.753144414"
Apr 23 08:50:55.970077 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:55.970005 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fx7dn" podStartSLOduration=5.601060027 podStartE2EDuration="21.969974913s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.407905664 +0000 UTC m=+3.220511035" lastFinishedPulling="2026-04-23 08:50:53.776820541 +0000 UTC m=+19.589425921" observedRunningTime="2026-04-23 08:50:55.969792127 +0000 UTC m=+21.782397516" watchObservedRunningTime="2026-04-23 08:50:55.969974913 +0000 UTC m=+21.782580300"
Apr 23 08:50:56.442142 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:56.442059 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:56.442372 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:56.442222 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:56.442372 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:56.442283 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret podName:4bd15f5a-3c09-4ed5-b70b-145f046716e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:57.442264859 +0000 UTC m=+23.254870231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret") pod "global-pull-secret-syncer-8jbv7" (UID: "4bd15f5a-3c09-4ed5-b70b-145f046716e5") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:56.811780 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:56.811711 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:56.811928 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:56.811836 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:50:56.811928 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:56.811899 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:56.812075 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:56.812040 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:50:56.930969 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:56.930941 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"b9733d5c3719a696e0b692a2ccccd18d30ca8a8fd35d6daf34d2cbeea2027df4"}
Apr 23 08:50:57.450362 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:57.450333 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:57.450525 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:57.450461 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:57.450569 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:57.450525 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret podName:4bd15f5a-3c09-4ed5-b70b-145f046716e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:59.450507177 +0000 UTC m=+25.263112547 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret") pod "global-pull-secret-syncer-8jbv7" (UID: "4bd15f5a-3c09-4ed5-b70b-145f046716e5") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:57.811062 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:57.810973 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:57.811223 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:57.811104 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:50:58.811847 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.811572 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:50:58.812666 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.811618 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:50:58.812666 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:58.811914 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:50:58.812666 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:58.812012 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:50:58.940880 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.940839 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" event={"ID":"76d63805-1298-4d8e-8c56-96c2091a8697","Type":"ContainerStarted","Data":"d2a4f806f0d209a94527dfdabd0ad635986e0490c6ffd30de128b7f46c23e1c0"}
Apr 23 08:50:58.941199 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.941170 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:58.941299 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.941205 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:58.958303 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.958281 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:58.958888 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.958869 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:50:58.967544 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:58.967494 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" podStartSLOduration=8.316639563 podStartE2EDuration="24.967481127s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.41266689 +0000 UTC m=+3.225272270" lastFinishedPulling="2026-04-23 08:50:54.063508468 +0000 UTC m=+19.876113834" observedRunningTime="2026-04-23 08:50:58.96683425 +0000 UTC m=+24.779439632" watchObservedRunningTime="2026-04-23 08:50:58.967481127 +0000 UTC m=+24.780086513"
Apr 23 08:50:59.467122 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:59.467062 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:59.467288 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:59.467172 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:59.467288 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:59.467219 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret podName:4bd15f5a-3c09-4ed5-b70b-145f046716e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:03.467205972 +0000 UTC m=+29.279811337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret") pod "global-pull-secret-syncer-8jbv7" (UID: "4bd15f5a-3c09-4ed5-b70b-145f046716e5") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:59.810766 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:59.810716 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:50:59.810864 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:50:59.810803 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:50:59.945124 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:59.945084 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="6363c75a684d88253d8edbeaec9ada9df363b68f7aeb8b881a5582935316da18" exitCode=0
Apr 23 08:50:59.945718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:59.945177 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"6363c75a684d88253d8edbeaec9ada9df363b68f7aeb8b881a5582935316da18"}
Apr 23 08:50:59.945718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:50:59.945449 2559 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:51:00.688394 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.687844 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7z6cq"]
Apr 23 08:51:00.688394 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.688185 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:00.688394 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:00.688290 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:51:00.694426 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.694289 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lx4sg"]
Apr 23 08:51:00.694426 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.694416 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:51:00.694594 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:00.694539 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:51:00.694954 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.694931 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8jbv7"]
Apr 23 08:51:00.695071 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.695044 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:00.695153 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:00.695130 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:51:00.948498 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.948473 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="49cb90ee00b40fa912df427d3945c00e31283e6a214d7e8efd94aad2106cb962" exitCode=0
Apr 23 08:51:00.948883 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.948565 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"49cb90ee00b40fa912df427d3945c00e31283e6a214d7e8efd94aad2106cb962"}
Apr 23 08:51:00.948883 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:00.948745 2559 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:51:01.810747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:01.810685 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:51:01.810854 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:01.810822 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:51:01.952458 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:01.952428 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="939664cd9bfaf4a7f91fd3354b5c1e3263607cb192b3a654fb0f5ffd32c62691" exitCode=0
Apr 23 08:51:01.952765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:01.952466 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"939664cd9bfaf4a7f91fd3354b5c1e3263607cb192b3a654fb0f5ffd32c62691"}
Apr 23 08:51:02.810781 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:02.810752 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:02.810904 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:02.810798 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:02.810904 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:02.810874 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:51:02.811050 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:02.811010 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:51:03.466838 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.466801 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:51:03.467359 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.467077 2559 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:51:03.478106 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.478050 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" podUID="76d63805-1298-4d8e-8c56-96c2091a8697" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 23 08:51:03.487445 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.487413 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz" podUID="76d63805-1298-4d8e-8c56-96c2091a8697" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 23 08:51:03.503229 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.503201 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:03.503385 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:03.503367 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:51:03.503444 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:03.503436 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret podName:4bd15f5a-3c09-4ed5-b70b-145f046716e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:11.503416655 +0000 UTC m=+37.316022027 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret") pod "global-pull-secret-syncer-8jbv7" (UID: "4bd15f5a-3c09-4ed5-b70b-145f046716e5") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:51:03.811716 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:03.811689 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:51:03.811855 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:03.811797 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:51:04.812128 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:04.812103 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:04.812609 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:04.812194 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:04.812609 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:04.812229 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:51:04.812609 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:04.812273 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:51:05.811311 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:05.811282 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:51:05.811476 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:05.811404 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e"
Apr 23 08:51:06.811079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:06.811006 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:06.811079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:06.811026 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:06.811547 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:06.811125 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7z6cq" podUID="4dc177c2-0f81-4db4-ac46-adbf96e2b0c5"
Apr 23 08:51:06.811547 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:06.811260 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8jbv7" podUID="4bd15f5a-3c09-4ed5-b70b-145f046716e5"
Apr 23 08:51:06.986339 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:06.986311 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-31.ec2.internal" event="NodeReady"
Apr 23 08:51:06.986515 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:06.986439 2559 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:51:07.033823 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.033784 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ghbpk"]
Apr 23 08:51:07.072582 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.072522 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l59kj"]
Apr 23 08:51:07.072713 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.072689 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.074665 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.074645 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:51:07.074890 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.074868 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\"" Apr 23 08:51:07.075032 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.074966 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:51:07.086957 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.086941 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ghbpk"] Apr 23 08:51:07.086957 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.086961 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l59kj"] Apr 23 08:51:07.087118 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.087056 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.089250 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.088891 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:51:07.089250 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.089082 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:51:07.089250 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.089135 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:51:07.089250 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.089218 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\"" Apr 23 08:51:07.233475 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233442 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9tx\" (UniqueName: \"kubernetes.io/projected/492da4be-9e87-49d4-91cf-b96c4bece553-kube-api-access-gg9tx\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.233631 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233493 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.233631 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233523 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/26cb0581-bb5b-4912-8a69-48378d6dc35b-tmp-dir\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.233631 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233569 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.233631 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233601 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tr6\" (UniqueName: \"kubernetes.io/projected/26cb0581-bb5b-4912-8a69-48378d6dc35b-kube-api-access-t7tr6\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.233811 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.233655 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cb0581-bb5b-4912-8a69-48378d6dc35b-config-volume\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.334741 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.334665 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.334741 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.334702 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/26cb0581-bb5b-4912-8a69-48378d6dc35b-tmp-dir\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.334925 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.334751 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.334925 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.334832 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:07.334925 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.334841 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:07.334925 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.334898 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:07.834878463 +0000 UTC m=+33.647483832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found Apr 23 08:51:07.334925 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.334918 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:07.834908875 +0000 UTC m=+33.647514242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found Apr 23 08:51:07.335208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.335007 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tr6\" (UniqueName: \"kubernetes.io/projected/26cb0581-bb5b-4912-8a69-48378d6dc35b-kube-api-access-t7tr6\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.335208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.335086 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26cb0581-bb5b-4912-8a69-48378d6dc35b-tmp-dir\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.335208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.335090 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cb0581-bb5b-4912-8a69-48378d6dc35b-config-volume\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.335208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.335152 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9tx\" (UniqueName: \"kubernetes.io/projected/492da4be-9e87-49d4-91cf-b96c4bece553-kube-api-access-gg9tx\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.335621 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.335601 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cb0581-bb5b-4912-8a69-48378d6dc35b-config-volume\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.348342 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.348183 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tr6\" (UniqueName: \"kubernetes.io/projected/26cb0581-bb5b-4912-8a69-48378d6dc35b-kube-api-access-t7tr6\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.348431 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.348299 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9tx\" (UniqueName: \"kubernetes.io/projected/492da4be-9e87-49d4-91cf-b96c4bece553-kube-api-access-gg9tx\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.811140 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.811062 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:51:07.812857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.812836 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:51:07.812967 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.812840 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\"" Apr 23 08:51:07.838108 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.838085 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:07.838201 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:07.838122 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:07.838244 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.838212 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:07.838244 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.838214 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:07.838313 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.838272 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:51:08.838252807 +0000 UTC m=+34.650858189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found Apr 23 08:51:07.838313 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:07.838289 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:08.838280933 +0000 UTC m=+34.650886303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found Apr 23 08:51:08.542020 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.541926 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:51:08.542141 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.542078 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:51:08.542181 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.542150 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. 
No retries permitted until 2026-04-23 08:51:40.542123364 +0000 UTC m=+66.354728747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : secret "metrics-daemon-secret" not found Apr 23 08:51:08.643029 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.643004 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:51:08.643133 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.643115 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:51:08.643133 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.643129 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:51:08.643203 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.643138 2559 projected.go:194] Error preparing data for projected volume kube-api-access-qjkdd for pod openshift-network-diagnostics/network-check-target-7z6cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:51:08.643203 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.643180 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd podName:4dc177c2-0f81-4db4-ac46-adbf96e2b0c5 
nodeName:}" failed. No retries permitted until 2026-04-23 08:51:40.643167803 +0000 UTC m=+66.455773168 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjkdd" (UniqueName: "kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd") pod "network-check-target-7z6cq" (UID: "4dc177c2-0f81-4db4-ac46-adbf96e2b0c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:51:08.811401 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.811339 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7" Apr 23 08:51:08.811826 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.811549 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq" Apr 23 08:51:08.813538 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.813514 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:51:08.813661 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.813558 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:51:08.813914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.813895 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\"" Apr 23 08:51:08.813914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.813906 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:51:08.844568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.844544 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:08.844651 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.844602 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:08.844690 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.844667 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:08.844690 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.844674 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:08.844764 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.844715 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:10.844702146 +0000 UTC m=+36.657307517 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found Apr 23 08:51:08.844764 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:08.844730 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:51:10.844723919 +0000 UTC m=+36.657329284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found Apr 23 08:51:08.967416 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.967393 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="9bf76ef1313c324810be4d504a960ba1218a9e537129240488b032633ea65561" exitCode=0 Apr 23 08:51:08.967508 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:08.967446 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"9bf76ef1313c324810be4d504a960ba1218a9e537129240488b032633ea65561"} Apr 23 08:51:09.971598 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:09.971567 2559 generic.go:358] "Generic (PLEG): container finished" podID="757f8beb-3271-44ad-88be-22369a09a56a" containerID="f2ae79b426c6c175c28f0a357576f864c600c43384ac1ad4a12b3c70d876ead5" exitCode=0 Apr 23 08:51:09.971947 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:09.971604 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerDied","Data":"f2ae79b426c6c175c28f0a357576f864c600c43384ac1ad4a12b3c70d876ead5"} Apr 23 08:51:10.858840 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:10.858811 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:51:10.858949 
ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:10.858856 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:51:10.858949 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:10.858935 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:10.858949 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:10.858938 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:10.859085 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:10.859008 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:14.85897453 +0000 UTC m=+40.671579896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found Apr 23 08:51:10.859085 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:10.859026 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:14.859018724 +0000 UTC m=+40.671624091 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found Apr 23 08:51:10.975851 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:10.975826 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" event={"ID":"757f8beb-3271-44ad-88be-22369a09a56a","Type":"ContainerStarted","Data":"40dc385054d588acc0fd347f95d3750171ba46bd3588f829a744efb725c098ee"} Apr 23 08:51:10.996344 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:10.996304 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nxtdf" podStartSLOduration=6.542955408 podStartE2EDuration="36.996290375s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:50:37.41187368 +0000 UTC m=+3.224479060" lastFinishedPulling="2026-04-23 08:51:07.865208658 +0000 UTC m=+33.677814027" observedRunningTime="2026-04-23 08:51:10.995229489 +0000 UTC m=+36.807834879" watchObservedRunningTime="2026-04-23 08:51:10.996290375 +0000 UTC m=+36.808895763" Apr 23 08:51:11.565114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:11.565078 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: \"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7" Apr 23 08:51:11.568400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:11.568375 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bd15f5a-3c09-4ed5-b70b-145f046716e5-original-pull-secret\") pod \"global-pull-secret-syncer-8jbv7\" (UID: 
\"4bd15f5a-3c09-4ed5-b70b-145f046716e5\") " pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:11.821053 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:11.820980 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8jbv7"
Apr 23 08:51:12.010495 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:12.010347 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8jbv7"]
Apr 23 08:51:12.014262 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:51:12.014239 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd15f5a_3c09_4ed5_b70b_145f046716e5.slice/crio-53a46576038850eaa65befe2ccdd46713685a4ce9c998ad7b13f9ce02a8c41e1 WatchSource:0}: Error finding container 53a46576038850eaa65befe2ccdd46713685a4ce9c998ad7b13f9ce02a8c41e1: Status 404 returned error can't find the container with id 53a46576038850eaa65befe2ccdd46713685a4ce9c998ad7b13f9ce02a8c41e1
Apr 23 08:51:12.980275 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:12.980241 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8jbv7" event={"ID":"4bd15f5a-3c09-4ed5-b70b-145f046716e5","Type":"ContainerStarted","Data":"53a46576038850eaa65befe2ccdd46713685a4ce9c998ad7b13f9ce02a8c41e1"}
Apr 23 08:51:14.890603 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:14.890567 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj"
Apr 23 08:51:14.891211 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:14.890639 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk"
Apr 23 08:51:14.891211 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:14.890723 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:14.891211 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:14.890750 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:14.891211 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:14.890788 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:22.890771543 +0000 UTC m=+48.703376909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found
Apr 23 08:51:14.891211 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:14.890805 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:22.890797257 +0000 UTC m=+48.703402623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:16.988942 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:16.988899 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8jbv7" event={"ID":"4bd15f5a-3c09-4ed5-b70b-145f046716e5","Type":"ContainerStarted","Data":"2ef6daddff1109171a187dd6f4d596ccf8d6ab69476979f907550882836c1a22"}
Apr 23 08:51:17.006020 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:17.005949 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8jbv7" podStartSLOduration=18.126006214 podStartE2EDuration="22.005936819s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:51:12.015844993 +0000 UTC m=+37.828450359" lastFinishedPulling="2026-04-23 08:51:15.895775585 +0000 UTC m=+41.708380964" observedRunningTime="2026-04-23 08:51:17.005565537 +0000 UTC m=+42.818170924" watchObservedRunningTime="2026-04-23 08:51:17.005936819 +0000 UTC m=+42.818542203"
Apr 23 08:51:22.943644 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:22.943607 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj"
Apr 23 08:51:22.944018 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:22.943654 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk"
Apr 23 08:51:22.944018 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:22.943753 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:22.944018 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:22.943756 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:22.944018 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:22.943818 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:38.943803213 +0000 UTC m=+64.756408579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found
Apr 23 08:51:22.944018 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:22.943832 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:38.943825941 +0000 UTC m=+64.756431307 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:33.487790 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:33.487763 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sqslz"
Apr 23 08:51:38.944493 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:38.944461 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj"
Apr 23 08:51:38.944493 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:38.944504 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk"
Apr 23 08:51:38.945032 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:38.944604 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:38.945032 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:38.944663 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:52:10.944649409 +0000 UTC m=+96.757254774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:38.945032 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:38.944604 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:38.945032 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:38.944747 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:10.944734365 +0000 UTC m=+96.757339743 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found
Apr 23 08:51:40.554210 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.554170 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:51:40.554598 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:40.554309 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 08:51:40.554598 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:51:40.554377 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:52:44.554362172 +0000 UTC m=+130.366967541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : secret "metrics-daemon-secret" not found
Apr 23 08:51:40.654883 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.654856 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:40.657061 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.657036 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:51:40.666962 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.666942 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:51:40.679327 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.679301 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjkdd\" (UniqueName: \"kubernetes.io/projected/4dc177c2-0f81-4db4-ac46-adbf96e2b0c5-kube-api-access-qjkdd\") pod \"network-check-target-7z6cq\" (UID: \"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5\") " pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:40.927923 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.927864 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\""
Apr 23 08:51:40.936595 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:40.936580 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:41.059566 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:41.059529 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7z6cq"]
Apr 23 08:51:41.063083 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:51:41.063057 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc177c2_0f81_4db4_ac46_adbf96e2b0c5.slice/crio-c0c9ef641f2105e6f75ef333667e8ae78a91d3f697cff54647a5aa041e2eaa61 WatchSource:0}: Error finding container c0c9ef641f2105e6f75ef333667e8ae78a91d3f697cff54647a5aa041e2eaa61: Status 404 returned error can't find the container with id c0c9ef641f2105e6f75ef333667e8ae78a91d3f697cff54647a5aa041e2eaa61
Apr 23 08:51:42.054534 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:42.054493 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7z6cq" event={"ID":"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5","Type":"ContainerStarted","Data":"c0c9ef641f2105e6f75ef333667e8ae78a91d3f697cff54647a5aa041e2eaa61"}
Apr 23 08:51:44.059822 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:44.059743 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7z6cq" event={"ID":"4dc177c2-0f81-4db4-ac46-adbf96e2b0c5","Type":"ContainerStarted","Data":"2becf51d147716cdc1f3b07063caeaa706dd70b2b8c51dee3266cf376adb1062"}
Apr 23 08:51:44.060162 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:44.059879 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:51:44.077425 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:51:44.077380 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7z6cq" podStartSLOduration=67.393243623 podStartE2EDuration="1m10.077366573s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:51:41.06524766 +0000 UTC m=+66.877853025" lastFinishedPulling="2026-04-23 08:51:43.749370604 +0000 UTC m=+69.561975975" observedRunningTime="2026-04-23 08:51:44.076590071 +0000 UTC m=+69.889195458" watchObservedRunningTime="2026-04-23 08:51:44.077366573 +0000 UTC m=+69.889971961"
Apr 23 08:52:11.038292 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:11.038256 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj"
Apr 23 08:52:11.038697 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:11.038303 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk"
Apr 23 08:52:11.038697 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:11.038399 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:52:11.038697 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:11.038474 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert podName:492da4be-9e87-49d4-91cf-b96c4bece553 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:15.038458508 +0000 UTC m=+160.851063875 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert") pod "ingress-canary-l59kj" (UID: "492da4be-9e87-49d4-91cf-b96c4bece553") : secret "canary-serving-cert" not found
Apr 23 08:52:11.038697 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:11.038402 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:52:11.038697 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:11.038549 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls podName:26cb0581-bb5b-4912-8a69-48378d6dc35b nodeName:}" failed. No retries permitted until 2026-04-23 08:53:15.038534433 +0000 UTC m=+160.851139805 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls") pod "dns-default-ghbpk" (UID: "26cb0581-bb5b-4912-8a69-48378d6dc35b") : secret "dns-default-metrics-tls" not found
Apr 23 08:52:15.063826 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:15.063790 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7z6cq"
Apr 23 08:52:44.554964 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:44.554925 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:52:44.555496 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:44.555106 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 08:52:44.555496 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:44.555196 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs podName:61450b58-933b-4b5d-bf40-9e4408670e3e nodeName:}" failed. No retries permitted until 2026-04-23 08:54:46.555180261 +0000 UTC m=+252.367785628 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs") pod "network-metrics-daemon-lx4sg" (UID: "61450b58-933b-4b5d-bf40-9e4408670e3e") : secret "metrics-daemon-secret" not found
Apr 23 08:52:58.668767 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.668727 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"]
Apr 23 08:52:58.671628 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.671599 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.674150 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.674122 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.674575 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.674545 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4cvxj\""
Apr 23 08:52:58.674690 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.674549 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 08:52:58.674690 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.674549 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.674690 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.674561 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 08:52:58.681119 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.681098 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"]
Apr 23 08:52:58.747974 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.747953 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.748105 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.747979 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdhn6\" (UniqueName: \"kubernetes.io/projected/88c6d6b4-1465-4419-b9d5-b3d67fda3332-kube-api-access-zdhn6\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.748105 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.748061 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/88c6d6b4-1465-4419-b9d5-b3d67fda3332-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.774947 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.774922 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"]
Apr 23 08:52:58.777699 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.777682 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.779319 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.779302 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:52:58.779534 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.779508 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:52:58.779614 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.779561 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:52:58.780154 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.780139 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48746\""
Apr 23 08:52:58.785462 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.785444 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:52:58.794111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.794089 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"]
Apr 23 08:52:58.849048 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849029 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849157 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849064 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849157 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849093 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849157 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849117 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849322 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849168 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849322 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849225 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.849322 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849259 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849322 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:58.849293 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 08:52:58.849546 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:58.849350 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls podName:88c6d6b4-1465-4419-b9d5-b3d67fda3332 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:59.349335825 +0000 UTC m=+145.161941196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vjqf6" (UID: "88c6d6b4-1465-4419-b9d5-b3d67fda3332") : secret "cluster-monitoring-operator-tls" not found
Apr 23 08:52:58.849546 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849291 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdhn6\" (UniqueName: \"kubernetes.io/projected/88c6d6b4-1465-4419-b9d5-b3d67fda3332-kube-api-access-zdhn6\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.849546 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849421 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgzt6\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.849546 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849472 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/88c6d6b4-1465-4419-b9d5-b3d67fda3332-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.849546 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.849499 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.850031 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.850016 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/88c6d6b4-1465-4419-b9d5-b3d67fda3332-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.857169 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.857152 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdhn6\" (UniqueName: \"kubernetes.io/projected/88c6d6b4-1465-4419-b9d5-b3d67fda3332-kube-api-access-zdhn6\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"
Apr 23 08:52:58.868726 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.868703 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6"]
Apr 23 08:52:58.871431 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.871416 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6"
Apr 23 08:52:58.872148 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.872132 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"]
Apr 23 08:52:58.873645 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.873623 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.874126 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.874104 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 08:52:58.874208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.873664 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.874268 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.874226 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 08:52:58.874773 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.874752 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mcfnj\""
Apr 23 08:52:58.877932 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.877917 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"
Apr 23 08:52:58.879798 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.879775 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.879889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.879808 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 08:52:58.879889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.879815 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 08:52:58.879889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.879871 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.880162 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.880139 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fjrvd\""
Apr 23 08:52:58.880704 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.880683 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6"]
Apr 23 08:52:58.883248 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.883230 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"]
Apr 23 08:52:58.950117 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950064 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.950117 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950098 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.950117 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950115 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950132 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950158 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4rmq\" (UniqueName: \"kubernetes.io/projected/93831e97-cd75-4159-a7be-102ac3929f81-kube-api-access-q4rmq\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:58.950221 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:58.950238 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcb9d88d6-qvq77: secret "image-registry-tls" not found
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950287 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93831e97-cd75-4159-a7be-102ac3929f81-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:58.950305 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls podName:4ddd3684-ea96-4a10-88e8-669c7b940363 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:59.45028735 +0000 UTC m=+145.262892722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls") pod "image-registry-7bcb9d88d6-qvq77" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363") : secret "image-registry-tls" not found
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950343 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950374 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93831e97-cd75-4159-a7be-102ac3929f81-config\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"
Apr 23 08:52:58.950547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950524 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950563 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID:
\"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950593 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2psqg\" (UniqueName: \"kubernetes.io/projected/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-kube-api-access-2psqg\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950650 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgzt6\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950686 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950703 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " 
pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.951017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.950781 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.951321 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.951216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.951321 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.951285 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.952969 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.952948 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.953265 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.953245 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.960081 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.960061 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:58.961168 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:58.961144 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgzt6\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:59.051591 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051557 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4rmq\" (UniqueName: \"kubernetes.io/projected/93831e97-cd75-4159-a7be-102ac3929f81-kube-api-access-q4rmq\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.051681 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051654 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93831e97-cd75-4159-a7be-102ac3929f81-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" 
Apr 23 08:52:59.051723 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051682 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93831e97-cd75-4159-a7be-102ac3929f81-config\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.051723 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051720 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.051817 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051737 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2psqg\" (UniqueName: \"kubernetes.io/projected/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-kube-api-access-2psqg\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.051817 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.051768 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.052223 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.052208 2559 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.052777 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.052756 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93831e97-cd75-4159-a7be-102ac3929f81-config\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.053558 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.053536 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.054126 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.054110 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93831e97-cd75-4159-a7be-102ac3929f81-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.058245 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.058224 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4rmq\" (UniqueName: \"kubernetes.io/projected/93831e97-cd75-4159-a7be-102ac3929f81-kube-api-access-q4rmq\") pod 
\"service-ca-operator-d6fc45fc5-4kczr\" (UID: \"93831e97-cd75-4159-a7be-102ac3929f81\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.058457 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.058440 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2psqg\" (UniqueName: \"kubernetes.io/projected/158e7a7b-4a03-4678-8b2c-0dc7d0b7913c-kube-api-access-2psqg\") pod \"kube-storage-version-migrator-operator-6769c5d45-2hzz6\" (UID: \"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.182035 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.182015 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" Apr 23 08:52:59.188623 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.188559 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" Apr 23 08:52:59.305574 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.305546 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6"] Apr 23 08:52:59.308278 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:52:59.308246 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158e7a7b_4a03_4678_8b2c_0dc7d0b7913c.slice/crio-af68e9537aed1157ac94b763f6f5b21489b29a2eebe7deef358f54e690a21145 WatchSource:0}: Error finding container af68e9537aed1157ac94b763f6f5b21489b29a2eebe7deef358f54e690a21145: Status 404 returned error can't find the container with id af68e9537aed1157ac94b763f6f5b21489b29a2eebe7deef358f54e690a21145 Apr 23 08:52:59.326643 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.326621 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr"] Apr 23 08:52:59.330830 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:52:59.330804 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93831e97_cd75_4159_a7be_102ac3929f81.slice/crio-16d89faff6eb21ae3a97f398ecfb04fb54042fb3418f39739f4de8ed607f98f7 WatchSource:0}: Error finding container 16d89faff6eb21ae3a97f398ecfb04fb54042fb3418f39739f4de8ed607f98f7: Status 404 returned error can't find the container with id 16d89faff6eb21ae3a97f398ecfb04fb54042fb3418f39739f4de8ed607f98f7 Apr 23 08:52:59.355346 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.355325 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:52:59.355488 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:59.355472 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:52:59.355553 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:59.355542 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls podName:88c6d6b4-1465-4419-b9d5-b3d67fda3332 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:00.355519329 +0000 UTC m=+146.168124700 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vjqf6" (UID: "88c6d6b4-1465-4419-b9d5-b3d67fda3332") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:52:59.456636 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:52:59.456612 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:52:59.456725 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:59.456712 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:52:59.456776 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:59.456726 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcb9d88d6-qvq77: 
secret "image-registry-tls" not found Apr 23 08:52:59.456776 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:52:59.456770 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls podName:4ddd3684-ea96-4a10-88e8-669c7b940363 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:00.456752009 +0000 UTC m=+146.269357395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls") pod "image-registry-7bcb9d88d6-qvq77" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363") : secret "image-registry-tls" not found Apr 23 08:53:00.200869 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:00.200830 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" event={"ID":"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c","Type":"ContainerStarted","Data":"af68e9537aed1157ac94b763f6f5b21489b29a2eebe7deef358f54e690a21145"} Apr 23 08:53:00.201941 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:00.201917 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" event={"ID":"93831e97-cd75-4159-a7be-102ac3929f81","Type":"ContainerStarted","Data":"16d89faff6eb21ae3a97f398ecfb04fb54042fb3418f39739f4de8ed607f98f7"} Apr 23 08:53:00.363144 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:00.363110 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:00.363397 ip-10-0-137-31 
kubenswrapper[2559]: E0423 08:53:00.363366 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:00.363507 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:00.363448 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls podName:88c6d6b4-1465-4419-b9d5-b3d67fda3332 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:02.36342709 +0000 UTC m=+148.176032462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vjqf6" (UID: "88c6d6b4-1465-4419-b9d5-b3d67fda3332") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:00.464121 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:00.464039 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:00.464238 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:00.464204 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:53:00.464238 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:00.464226 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcb9d88d6-qvq77: secret "image-registry-tls" not found Apr 23 08:53:00.464354 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:00.464282 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls podName:4ddd3684-ea96-4a10-88e8-669c7b940363 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:02.464267473 +0000 UTC m=+148.276872851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls") pod "image-registry-7bcb9d88d6-qvq77" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363") : secret "image-registry-tls" not found Apr 23 08:53:02.206718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:02.206684 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" event={"ID":"93831e97-cd75-4159-a7be-102ac3929f81","Type":"ContainerStarted","Data":"897ac52b1c5a2cb4f1322a451a63edcb38a04b3b7311a5092bf9cb3ecb62c16f"} Apr 23 08:53:02.207915 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:02.207890 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" event={"ID":"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c","Type":"ContainerStarted","Data":"5f7c49e4248473aa94bfe4a8621045240d0d3a44397bffc1501d04ffbe668a6e"} Apr 23 08:53:02.221173 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:02.221132 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" podStartSLOduration=2.207911044 podStartE2EDuration="4.221119849s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:52:59.332337247 +0000 UTC m=+145.144942613" lastFinishedPulling="2026-04-23 08:53:01.345546039 +0000 UTC m=+147.158151418" observedRunningTime="2026-04-23 08:53:02.220430662 +0000 UTC m=+148.033036051" watchObservedRunningTime="2026-04-23 08:53:02.221119849 +0000 UTC m=+148.033725234" Apr 23 08:53:02.233222 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:53:02.233179 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" podStartSLOduration=1.5490516250000002 podStartE2EDuration="4.233168689s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:52:59.310140308 +0000 UTC m=+145.122745677" lastFinishedPulling="2026-04-23 08:53:01.994257375 +0000 UTC m=+147.806862741" observedRunningTime="2026-04-23 08:53:02.233100556 +0000 UTC m=+148.045705945" watchObservedRunningTime="2026-04-23 08:53:02.233168689 +0000 UTC m=+148.045774111" Apr 23 08:53:02.380231 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:02.380166 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:02.380346 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:02.380238 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:02.380346 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:02.380284 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls podName:88c6d6b4-1465-4419-b9d5-b3d67fda3332 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:06.380272418 +0000 UTC m=+152.192877783 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vjqf6" (UID: "88c6d6b4-1465-4419-b9d5-b3d67fda3332") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:02.481430 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:02.481404 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:02.481530 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:02.481477 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:53:02.481530 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:02.481486 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcb9d88d6-qvq77: secret "image-registry-tls" not found Apr 23 08:53:02.481631 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:02.481543 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls podName:4ddd3684-ea96-4a10-88e8-669c7b940363 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:06.481531877 +0000 UTC m=+152.294137243 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls") pod "image-registry-7bcb9d88d6-qvq77" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363") : secret "image-registry-tls" not found Apr 23 08:53:03.243360 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.243329 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts"] Apr 23 08:53:03.246221 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.246205 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" Apr 23 08:53:03.247872 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.247854 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7j6r8\"" Apr 23 08:53:03.255368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.255345 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts"] Apr 23 08:53:03.388369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.388339 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckjl\" (UniqueName: \"kubernetes.io/projected/80e7d027-15dd-449f-b406-53648881a780-kube-api-access-gckjl\") pod \"network-check-source-8894fc9bd-mbsts\" (UID: \"80e7d027-15dd-449f-b406-53648881a780\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" Apr 23 08:53:03.489352 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.489325 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gckjl\" (UniqueName: \"kubernetes.io/projected/80e7d027-15dd-449f-b406-53648881a780-kube-api-access-gckjl\") pod \"network-check-source-8894fc9bd-mbsts\" (UID: 
\"80e7d027-15dd-449f-b406-53648881a780\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" Apr 23 08:53:03.501235 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.501186 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckjl\" (UniqueName: \"kubernetes.io/projected/80e7d027-15dd-449f-b406-53648881a780-kube-api-access-gckjl\") pod \"network-check-source-8894fc9bd-mbsts\" (UID: \"80e7d027-15dd-449f-b406-53648881a780\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" Apr 23 08:53:03.554122 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.554080 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" Apr 23 08:53:03.666312 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:03.666279 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts"] Apr 23 08:53:03.669002 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:03.668947 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e7d027_15dd_449f_b406_53648881a780.slice/crio-91ac8f5632bc5972d35a18bc047d08c0f842167206a51c1bdae2d784f8251044 WatchSource:0}: Error finding container 91ac8f5632bc5972d35a18bc047d08c0f842167206a51c1bdae2d784f8251044: Status 404 returned error can't find the container with id 91ac8f5632bc5972d35a18bc047d08c0f842167206a51c1bdae2d784f8251044 Apr 23 08:53:04.212462 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:04.212430 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" event={"ID":"80e7d027-15dd-449f-b406-53648881a780","Type":"ContainerStarted","Data":"b714e55f3a076082d8930a6633b8b5c7430c3f246b9ef59bf61946fb44d6a703"} Apr 23 08:53:04.212462 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:53:04.212465 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" event={"ID":"80e7d027-15dd-449f-b406-53648881a780","Type":"ContainerStarted","Data":"91ac8f5632bc5972d35a18bc047d08c0f842167206a51c1bdae2d784f8251044"} Apr 23 08:53:04.228819 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:04.228774 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mbsts" podStartSLOduration=1.228757253 podStartE2EDuration="1.228757253s" podCreationTimestamp="2026-04-23 08:53:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:53:04.228337507 +0000 UTC m=+150.040942896" watchObservedRunningTime="2026-04-23 08:53:04.228757253 +0000 UTC m=+150.041362643" Apr 23 08:53:05.123203 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.123173 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dnx2l_c5cbe02b-2850-4700-8339-43f2fe5f24d5/dns-node-resolver/0.log" Apr 23 08:53:05.282974 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.282945 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hmlz7"] Apr 23 08:53:05.285836 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.285822 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.287581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.287559 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 08:53:05.287581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.287577 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 08:53:05.287718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.287564 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 08:53:05.287975 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.287958 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-v4lfv\"" Apr 23 08:53:05.288040 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.287962 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 08:53:05.293208 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.293187 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hmlz7"] Apr 23 08:53:05.402760 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.402686 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/973545b9-390d-46f9-8628-2e039ad29e3c-signing-cabundle\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.403037 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.403019 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/973545b9-390d-46f9-8628-2e039ad29e3c-signing-key\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.403321 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.403303 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rvf\" (UniqueName: \"kubernetes.io/projected/973545b9-390d-46f9-8628-2e039ad29e3c-kube-api-access-s5rvf\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.504698 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.504674 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/973545b9-390d-46f9-8628-2e039ad29e3c-signing-key\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.504781 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.504721 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rvf\" (UniqueName: \"kubernetes.io/projected/973545b9-390d-46f9-8628-2e039ad29e3c-kube-api-access-s5rvf\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.504781 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.504756 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/973545b9-390d-46f9-8628-2e039ad29e3c-signing-cabundle\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.505340 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:53:05.505324 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/973545b9-390d-46f9-8628-2e039ad29e3c-signing-cabundle\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.506948 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.506922 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/973545b9-390d-46f9-8628-2e039ad29e3c-signing-key\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.512122 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.512100 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rvf\" (UniqueName: \"kubernetes.io/projected/973545b9-390d-46f9-8628-2e039ad29e3c-kube-api-access-s5rvf\") pod \"service-ca-865cb79987-hmlz7\" (UID: \"973545b9-390d-46f9-8628-2e039ad29e3c\") " pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.594768 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.594747 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hmlz7" Apr 23 08:53:05.705253 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:05.705227 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hmlz7"] Apr 23 08:53:05.708622 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:05.708599 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973545b9_390d_46f9_8628_2e039ad29e3c.slice/crio-9e31917b02480ced2a249f82be9a943db56ab5414c8513d4c4b007b002d21a34 WatchSource:0}: Error finding container 9e31917b02480ced2a249f82be9a943db56ab5414c8513d4c4b007b002d21a34: Status 404 returned error can't find the container with id 9e31917b02480ced2a249f82be9a943db56ab5414c8513d4c4b007b002d21a34 Apr 23 08:53:06.123158 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.123136 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j7htq_d23173d3-6e3a-4b55-b1dc-7075f2278e15/node-ca/0.log" Apr 23 08:53:06.217361 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.217334 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hmlz7" event={"ID":"973545b9-390d-46f9-8628-2e039ad29e3c","Type":"ContainerStarted","Data":"f507817207b2c733717fdec07bb7cceeb1006e03a586dc837fca790f33a2755d"} Apr 23 08:53:06.217764 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.217368 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hmlz7" event={"ID":"973545b9-390d-46f9-8628-2e039ad29e3c","Type":"ContainerStarted","Data":"9e31917b02480ced2a249f82be9a943db56ab5414c8513d4c4b007b002d21a34"} Apr 23 08:53:06.231779 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.231734 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-hmlz7" podStartSLOduration=1.231717904 
podStartE2EDuration="1.231717904s" podCreationTimestamp="2026-04-23 08:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:53:06.231298951 +0000 UTC m=+152.043904340" watchObservedRunningTime="2026-04-23 08:53:06.231717904 +0000 UTC m=+152.044323294" Apr 23 08:53:06.412978 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.412912 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:06.413116 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:06.413036 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:06.413116 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:06.413099 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls podName:88c6d6b4-1465-4419-b9d5-b3d67fda3332 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:14.413086531 +0000 UTC m=+160.225691897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vjqf6" (UID: "88c6d6b4-1465-4419-b9d5-b3d67fda3332") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:53:06.514375 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:06.514346 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:06.514600 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:06.514578 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:53:06.514654 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:06.514605 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcb9d88d6-qvq77: secret "image-registry-tls" not found Apr 23 08:53:06.514711 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:06.514695 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls podName:4ddd3684-ea96-4a10-88e8-669c7b940363 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:14.514669867 +0000 UTC m=+160.327275252 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls") pod "image-registry-7bcb9d88d6-qvq77" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363") : secret "image-registry-tls" not found Apr 23 08:53:10.083667 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:10.083629 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ghbpk" podUID="26cb0581-bb5b-4912-8a69-48378d6dc35b" Apr 23 08:53:10.096786 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:10.096752 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l59kj" podUID="492da4be-9e87-49d4-91cf-b96c4bece553" Apr 23 08:53:10.226890 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:10.226865 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:10.819675 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:10.819638 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lx4sg" podUID="61450b58-933b-4b5d-bf40-9e4408670e3e" Apr 23 08:53:14.473541 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.473507 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:14.475734 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.475714 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c6d6b4-1465-4419-b9d5-b3d67fda3332-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vjqf6\" (UID: \"88c6d6b4-1465-4419-b9d5-b3d67fda3332\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:14.574411 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.574376 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:14.576515 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.576486 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"image-registry-7bcb9d88d6-qvq77\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:14.581318 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.581298 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" Apr 23 08:53:14.686932 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.686903 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:14.703783 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.703733 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6"] Apr 23 08:53:14.706347 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:14.706314 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c6d6b4_1465_4419_b9d5_b3d67fda3332.slice/crio-aa396d8b7fed23876feeada8de088703edcb2f1174616d11e711e189c8561820 WatchSource:0}: Error finding container aa396d8b7fed23876feeada8de088703edcb2f1174616d11e711e189c8561820: Status 404 returned error can't find the container with id aa396d8b7fed23876feeada8de088703edcb2f1174616d11e711e189c8561820 Apr 23 08:53:14.815779 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:14.815748 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"] Apr 23 08:53:14.819055 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:14.819028 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ddd3684_ea96_4a10_88e8_669c7b940363.slice/crio-d347715d25545852bdc36504c02c981a3d5e151d091e04bc3ce42595ff9dbf55 WatchSource:0}: Error 
finding container d347715d25545852bdc36504c02c981a3d5e151d091e04bc3ce42595ff9dbf55: Status 404 returned error can't find the container with id d347715d25545852bdc36504c02c981a3d5e151d091e04bc3ce42595ff9dbf55 Apr 23 08:53:15.078268 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.078195 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:53:15.078268 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.078245 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:15.080451 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.080424 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cb0581-bb5b-4912-8a69-48378d6dc35b-metrics-tls\") pod \"dns-default-ghbpk\" (UID: \"26cb0581-bb5b-4912-8a69-48378d6dc35b\") " pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:15.080688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.080665 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/492da4be-9e87-49d4-91cf-b96c4bece553-cert\") pod \"ingress-canary-l59kj\" (UID: \"492da4be-9e87-49d4-91cf-b96c4bece553\") " pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:53:15.239403 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.239362 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" 
event={"ID":"4ddd3684-ea96-4a10-88e8-669c7b940363","Type":"ContainerStarted","Data":"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"} Apr 23 08:53:15.239403 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.239408 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" event={"ID":"4ddd3684-ea96-4a10-88e8-669c7b940363","Type":"ContainerStarted","Data":"d347715d25545852bdc36504c02c981a3d5e151d091e04bc3ce42595ff9dbf55"} Apr 23 08:53:15.239643 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.239519 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:15.240435 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.240404 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" event={"ID":"88c6d6b4-1465-4419-b9d5-b3d67fda3332","Type":"ContainerStarted","Data":"aa396d8b7fed23876feeada8de088703edcb2f1174616d11e711e189c8561820"} Apr 23 08:53:15.256978 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.256926 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" podStartSLOduration=17.256911152 podStartE2EDuration="17.256911152s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:53:15.255332773 +0000 UTC m=+161.067938162" watchObservedRunningTime="2026-04-23 08:53:15.256911152 +0000 UTC m=+161.069516541" Apr 23 08:53:15.329998 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.329927 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\"" Apr 23 08:53:15.338547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.338524 2559 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:15.465471 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:15.465444 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ghbpk"] Apr 23 08:53:15.468706 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:15.468671 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cb0581_bb5b_4912_8a69_48378d6dc35b.slice/crio-06cb8fb11120e2f2167dfece667e2ad26be77899d3cd43621dd4fdf4f354d37c WatchSource:0}: Error finding container 06cb8fb11120e2f2167dfece667e2ad26be77899d3cd43621dd4fdf4f354d37c: Status 404 returned error can't find the container with id 06cb8fb11120e2f2167dfece667e2ad26be77899d3cd43621dd4fdf4f354d37c Apr 23 08:53:16.244237 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:16.244195 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ghbpk" event={"ID":"26cb0581-bb5b-4912-8a69-48378d6dc35b","Type":"ContainerStarted","Data":"06cb8fb11120e2f2167dfece667e2ad26be77899d3cd43621dd4fdf4f354d37c"} Apr 23 08:53:17.248341 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.248295 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" event={"ID":"88c6d6b4-1465-4419-b9d5-b3d67fda3332","Type":"ContainerStarted","Data":"a38275a210df851e973c7138869706158446290a704f2cbeea170ae002051e1e"} Apr 23 08:53:17.249914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.249887 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ghbpk" event={"ID":"26cb0581-bb5b-4912-8a69-48378d6dc35b","Type":"ContainerStarted","Data":"6380ae829b46cae6d66e721f5654a51ec48856c6fc85d92e0d7572dc13fb8b02"} Apr 23 08:53:17.249914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.249916 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-ghbpk" event={"ID":"26cb0581-bb5b-4912-8a69-48378d6dc35b","Type":"ContainerStarted","Data":"c9727fa766d49f50f475b2a4d610e7fb7fcb6c48d3935a4a1430b348a21512d8"} Apr 23 08:53:17.250093 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.250028 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:17.263380 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.263340 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vjqf6" podStartSLOduration=17.042405966 podStartE2EDuration="19.263306832s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:53:14.708149731 +0000 UTC m=+160.520755097" lastFinishedPulling="2026-04-23 08:53:16.929050587 +0000 UTC m=+162.741655963" observedRunningTime="2026-04-23 08:53:17.262505457 +0000 UTC m=+163.075110844" watchObservedRunningTime="2026-04-23 08:53:17.263306832 +0000 UTC m=+163.075912239" Apr 23 08:53:17.279502 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:17.279461 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ghbpk" podStartSLOduration=128.818355193 podStartE2EDuration="2m10.279449977s" podCreationTimestamp="2026-04-23 08:51:07 +0000 UTC" firstStartedPulling="2026-04-23 08:53:15.470462815 +0000 UTC m=+161.283068194" lastFinishedPulling="2026-04-23 08:53:16.931557612 +0000 UTC m=+162.744162978" observedRunningTime="2026-04-23 08:53:17.278774169 +0000 UTC m=+163.091379568" watchObservedRunningTime="2026-04-23 08:53:17.279449977 +0000 UTC m=+163.092055366" Apr 23 08:53:21.811526 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:21.811490 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:53:21.813754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:21.813734 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\"" Apr 23 08:53:21.822281 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:21.822263 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l59kj" Apr 23 08:53:21.957377 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:21.957349 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l59kj"] Apr 23 08:53:21.960216 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:21.960190 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod492da4be_9e87_49d4_91cf_b96c4bece553.slice/crio-107dd76f1507aabcc0e2b6ad459cddcd3b0a7cee7487c6f2f73bbf826c3a3807 WatchSource:0}: Error finding container 107dd76f1507aabcc0e2b6ad459cddcd3b0a7cee7487c6f2f73bbf826c3a3807: Status 404 returned error can't find the container with id 107dd76f1507aabcc0e2b6ad459cddcd3b0a7cee7487c6f2f73bbf826c3a3807 Apr 23 08:53:22.265557 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:22.265525 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l59kj" event={"ID":"492da4be-9e87-49d4-91cf-b96c4bece553","Type":"ContainerStarted","Data":"107dd76f1507aabcc0e2b6ad459cddcd3b0a7cee7487c6f2f73bbf826c3a3807"} Apr 23 08:53:22.811025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:22.810980 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg" Apr 23 08:53:24.270881 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:24.270838 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l59kj" event={"ID":"492da4be-9e87-49d4-91cf-b96c4bece553","Type":"ContainerStarted","Data":"05fd0ac7de8354d554e5c98dd8c557deb9e59d340f308c970c3867119ee6be18"} Apr 23 08:53:24.286437 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:24.286393 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l59kj" podStartSLOduration=135.332807275 podStartE2EDuration="2m17.286376983s" podCreationTimestamp="2026-04-23 08:51:07 +0000 UTC" firstStartedPulling="2026-04-23 08:53:21.96209456 +0000 UTC m=+167.774699926" lastFinishedPulling="2026-04-23 08:53:23.915664253 +0000 UTC m=+169.728269634" observedRunningTime="2026-04-23 08:53:24.286090091 +0000 UTC m=+170.098695479" watchObservedRunningTime="2026-04-23 08:53:24.286376983 +0000 UTC m=+170.098982700" Apr 23 08:53:25.475000 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.474957 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z"] Apr 23 08:53:25.477903 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.477889 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:25.479981 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.479960 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 08:53:25.480094 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.479980 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nbqcd\"" Apr 23 08:53:25.485980 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.485956 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z"] Apr 23 08:53:25.492744 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.492724 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"] Apr 23 08:53:25.547638 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.547607 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2dpsp"] Apr 23 08:53:25.550698 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.550680 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.552366 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.552345 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vx78w\"" Apr 23 08:53:25.552467 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.552433 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:53:25.552691 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.552674 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:53:25.552764 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.552674 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:53:25.552823 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.552675 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:53:25.560367 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.560347 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2dpsp"] Apr 23 08:53:25.651450 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651422 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5a69f4af-57b2-4d08-a860-f69a31bc13f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2bc5z\" (UID: \"5a69f4af-57b2-4d08-a860-f69a31bc13f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:25.651593 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651458 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/761dbda3-1985-42a4-a075-6cb13ccc1d11-data-volume\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.651593 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651480 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.651593 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651541 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/761dbda3-1985-42a4-a075-6cb13ccc1d11-crio-socket\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.651696 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651610 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lbv\" (UniqueName: \"kubernetes.io/projected/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-api-access-n4lbv\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.651696 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.651639 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/761dbda3-1985-42a4-a075-6cb13ccc1d11-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752350 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752287 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lbv\" (UniqueName: \"kubernetes.io/projected/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-api-access-n4lbv\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752350 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752319 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/761dbda3-1985-42a4-a075-6cb13ccc1d11-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752350 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752342 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5a69f4af-57b2-4d08-a860-f69a31bc13f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2bc5z\" (UID: \"5a69f4af-57b2-4d08-a860-f69a31bc13f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:25.752626 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752368 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/761dbda3-1985-42a4-a075-6cb13ccc1d11-data-volume\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752626 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:53:25.752398 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752626 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752435 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/761dbda3-1985-42a4-a075-6cb13ccc1d11-crio-socket\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752626 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752517 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/761dbda3-1985-42a4-a075-6cb13ccc1d11-crio-socket\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.752852 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.752836 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/761dbda3-1985-42a4-a075-6cb13ccc1d11-data-volume\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.753061 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.753034 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " 
pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.754595 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.754574 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/761dbda3-1985-42a4-a075-6cb13ccc1d11-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.754820 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.754804 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5a69f4af-57b2-4d08-a860-f69a31bc13f5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2bc5z\" (UID: \"5a69f4af-57b2-4d08-a860-f69a31bc13f5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:25.759978 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.759958 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lbv\" (UniqueName: \"kubernetes.io/projected/761dbda3-1985-42a4-a075-6cb13ccc1d11-kube-api-access-n4lbv\") pod \"insights-runtime-extractor-2dpsp\" (UID: \"761dbda3-1985-42a4-a075-6cb13ccc1d11\") " pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.786770 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.786748 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:25.859336 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.859309 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2dpsp" Apr 23 08:53:25.896011 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.895962 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z"] Apr 23 08:53:25.899632 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:25.899607 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a69f4af_57b2_4d08_a860_f69a31bc13f5.slice/crio-a3620b887407a0479ff112073d3c5a80ccd3cf94664fef7359d9f72d518aad6c WatchSource:0}: Error finding container a3620b887407a0479ff112073d3c5a80ccd3cf94664fef7359d9f72d518aad6c: Status 404 returned error can't find the container with id a3620b887407a0479ff112073d3c5a80ccd3cf94664fef7359d9f72d518aad6c Apr 23 08:53:25.972275 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:25.972246 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2dpsp"] Apr 23 08:53:25.975396 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:25.975368 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761dbda3_1985_42a4_a075_6cb13ccc1d11.slice/crio-27b2a12a9c4c150289bef42c7b07b8623025a87ef732d9e14d5b45234168c94b WatchSource:0}: Error finding container 27b2a12a9c4c150289bef42c7b07b8623025a87ef732d9e14d5b45234168c94b: Status 404 returned error can't find the container with id 27b2a12a9c4c150289bef42c7b07b8623025a87ef732d9e14d5b45234168c94b Apr 23 08:53:26.276202 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:26.276112 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" event={"ID":"5a69f4af-57b2-4d08-a860-f69a31bc13f5","Type":"ContainerStarted","Data":"a3620b887407a0479ff112073d3c5a80ccd3cf94664fef7359d9f72d518aad6c"} Apr 23 
08:53:26.277263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:26.277234 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dpsp" event={"ID":"761dbda3-1985-42a4-a075-6cb13ccc1d11","Type":"ContainerStarted","Data":"b855f309251bba068d68a66caa1ca10ebc24e6fe12c063e32137567aaffdc183"} Apr 23 08:53:26.277263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:26.277261 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dpsp" event={"ID":"761dbda3-1985-42a4-a075-6cb13ccc1d11","Type":"ContainerStarted","Data":"27b2a12a9c4c150289bef42c7b07b8623025a87ef732d9e14d5b45234168c94b"} Apr 23 08:53:27.255019 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.254980 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ghbpk" Apr 23 08:53:27.284199 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.284171 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" event={"ID":"5a69f4af-57b2-4d08-a860-f69a31bc13f5","Type":"ContainerStarted","Data":"3742b2adb8fbfe183d97b8c3c63ef2bbcfba6f685c821532261f489d0eedbc1b"} Apr 23 08:53:27.284463 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.284415 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:27.286014 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.285960 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dpsp" event={"ID":"761dbda3-1985-42a4-a075-6cb13ccc1d11","Type":"ContainerStarted","Data":"5e6bd31330dbad576cdf03dbb3aed1031c906a047d257d33356fca56ba0124e0"} Apr 23 08:53:27.289001 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.288969 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" Apr 23 08:53:27.298780 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.298737 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2bc5z" podStartSLOduration=1.14596604 podStartE2EDuration="2.298726231s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="2026-04-23 08:53:25.901967471 +0000 UTC m=+171.714572841" lastFinishedPulling="2026-04-23 08:53:27.054727662 +0000 UTC m=+172.867333032" observedRunningTime="2026-04-23 08:53:27.298048845 +0000 UTC m=+173.110654230" watchObservedRunningTime="2026-04-23 08:53:27.298726231 +0000 UTC m=+173.111331618" Apr 23 08:53:27.492230 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.492199 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-q8mrb"] Apr 23 08:53:27.495724 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.495707 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.497976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.497949 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 08:53:27.497976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.497950 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 08:53:27.498167 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.498097 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:53:27.498975 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.498951 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-crdf6\"" Apr 23 08:53:27.503400 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.502753 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-q8mrb"] Apr 23 08:53:27.665531 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.665502 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.665653 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.665537 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.665713 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.665647 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34002683-14aa-4497-a022-e41ce886599a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.665713 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.665688 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfkbk\" (UniqueName: \"kubernetes.io/projected/34002683-14aa-4497-a022-e41ce886599a-kube-api-access-gfkbk\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.767012 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.766936 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34002683-14aa-4497-a022-e41ce886599a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.767012 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.766973 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfkbk\" (UniqueName: \"kubernetes.io/projected/34002683-14aa-4497-a022-e41ce886599a-kube-api-access-gfkbk\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.767165 
ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.767032 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.767165 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.767062 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.767278 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:27.767240 2559 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 08:53:27.767332 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:27.767301 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls podName:34002683-14aa-4497-a022-e41ce886599a nodeName:}" failed. No retries permitted until 2026-04-23 08:53:28.267282071 +0000 UTC m=+174.079887437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-q8mrb" (UID: "34002683-14aa-4497-a022-e41ce886599a") : secret "prometheus-operator-tls" not found Apr 23 08:53:27.767672 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.767651 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34002683-14aa-4497-a022-e41ce886599a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.769727 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.769705 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:27.774723 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:27.774700 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfkbk\" (UniqueName: \"kubernetes.io/projected/34002683-14aa-4497-a022-e41ce886599a-kube-api-access-gfkbk\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:28.270298 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:28.270255 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: 
\"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:28.272889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:28.272865 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34002683-14aa-4497-a022-e41ce886599a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-q8mrb\" (UID: \"34002683-14aa-4497-a022-e41ce886599a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:28.410154 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:28.410125 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" Apr 23 08:53:28.635017 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:28.634974 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-q8mrb"] Apr 23 08:53:28.637910 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:28.637886 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34002683_14aa_4497_a022_e41ce886599a.slice/crio-302db83422d2f123389e4afbf17a0f8165fdf045f3db604192a49058aa9b7c66 WatchSource:0}: Error finding container 302db83422d2f123389e4afbf17a0f8165fdf045f3db604192a49058aa9b7c66: Status 404 returned error can't find the container with id 302db83422d2f123389e4afbf17a0f8165fdf045f3db604192a49058aa9b7c66 Apr 23 08:53:29.292751 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:29.292715 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" event={"ID":"34002683-14aa-4497-a022-e41ce886599a","Type":"ContainerStarted","Data":"302db83422d2f123389e4afbf17a0f8165fdf045f3db604192a49058aa9b7c66"} Apr 23 08:53:29.294425 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:29.294400 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2dpsp" event={"ID":"761dbda3-1985-42a4-a075-6cb13ccc1d11","Type":"ContainerStarted","Data":"c0d2dcfe8d3f0bd69f08e0b6b784a0acb914229d8e279ed691b44ea4f37525e3"} Apr 23 08:53:29.311386 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:29.311346 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2dpsp" podStartSLOduration=1.7626625630000001 podStartE2EDuration="4.311333576s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="2026-04-23 08:53:26.024653456 +0000 UTC m=+171.837258825" lastFinishedPulling="2026-04-23 08:53:28.573324458 +0000 UTC m=+174.385929838" observedRunningTime="2026-04-23 08:53:29.310343596 +0000 UTC m=+175.122948984" watchObservedRunningTime="2026-04-23 08:53:29.311333576 +0000 UTC m=+175.123938964" Apr 23 08:53:31.301078 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:31.301046 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" event={"ID":"34002683-14aa-4497-a022-e41ce886599a","Type":"ContainerStarted","Data":"666ec016c5ace79b51e07c0c7f03852bcf830eddb3420d20ba10ba52cae844a3"} Apr 23 08:53:31.301078 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:31.301080 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" event={"ID":"34002683-14aa-4497-a022-e41ce886599a","Type":"ContainerStarted","Data":"5930767c600282c711c07fd602bf628f1e07fc6bdd78a81768e5fbe97c0c8d91"} Apr 23 08:53:31.317498 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:31.317445 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-q8mrb" podStartSLOduration=2.348365741 podStartE2EDuration="4.317432373s" podCreationTimestamp="2026-04-23 08:53:27 +0000 UTC" firstStartedPulling="2026-04-23 08:53:28.640315474 +0000 UTC 
m=+174.452920844" lastFinishedPulling="2026-04-23 08:53:30.609382107 +0000 UTC m=+176.421987476" observedRunningTime="2026-04-23 08:53:31.316515881 +0000 UTC m=+177.129121270" watchObservedRunningTime="2026-04-23 08:53:31.317432373 +0000 UTC m=+177.130037760" Apr 23 08:53:32.857612 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.857566 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"] Apr 23 08:53:32.861146 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.861123 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"] Apr 23 08:53:32.861329 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.861306 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" Apr 23 08:53:32.863498 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.863224 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:53:32.863498 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.863294 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-p8ptz\"" Apr 23 08:53:32.863498 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.863222 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 08:53:32.864874 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.864427 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:32.866316 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.866027 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 08:53:32.866316 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.866054 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 08:53:32.866316 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.866166 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-7cq2v\"" Apr 23 08:53:32.866316 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.866270 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:53:32.869671 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.869650 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tgj7l"] Apr 23 08:53:32.876104 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.876082 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"] Apr 23 08:53:32.876245 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.876231 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"] Apr 23 08:53:32.876440 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.876425 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tgj7l" Apr 23 08:53:32.879976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.879959 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:53:32.880428 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.880411 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:53:32.880789 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.880772 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x268v\"" Apr 23 08:53:32.881175 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:32.881145 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:53:33.001268 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001243 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-wtmp\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l" Apr 23 08:53:33.001409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001280 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh6t5\" (UniqueName: \"kubernetes.io/projected/c306769a-094a-45dc-86f8-c3de6fc5d9e1-kube-api-access-gh6t5\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l" Apr 23 08:53:33.001409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001307 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2642f\" (UniqueName: \"kubernetes.io/projected/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-api-access-2642f\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:33.001409 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001397 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:33.001568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001449 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:33.001568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001488 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:33.001568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001522 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" Apr 23 08:53:33.001568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001547 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-root\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l" Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001571 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l" Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001608 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001649 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85da8388-1bc2-4cac-b714-3814193f1216-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001697 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25qxm\" (UniqueName: \"kubernetes.io/projected/85da8388-1bc2-4cac-b714-3814193f1216-kube-api-access-25qxm\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001725 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-metrics-client-ca\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.001765 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001748 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.002088 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001801 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.002088 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001844 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-textfile\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.002088 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001900 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.002088 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001946 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-sys\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.002088 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.001999 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103099 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103072 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103198 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103117 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103198 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103150 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103198 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103182 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.103327 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103206 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-root\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103327 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103263 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-root\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103425 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:33.103377 2559 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 23 08:53:33.103425 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103400 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103528 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:33.103439 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls podName:85da8388-1bc2-4cac-b714-3814193f1216 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:33.603418411 +0000 UTC m=+179.416023791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-twl26" (UID: "85da8388-1bc2-4cac-b714-3814193f1216") : secret "openshift-state-metrics-tls" not found
Apr 23 08:53:33.103528 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103462 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103528 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103495 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85da8388-1bc2-4cac-b714-3814193f1216-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.103688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103535 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25qxm\" (UniqueName: \"kubernetes.io/projected/85da8388-1bc2-4cac-b714-3814193f1216-kube-api-access-25qxm\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.103688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103564 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-metrics-client-ca\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103590 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103688 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103639 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103691 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-textfile\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103728 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103752 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-sys\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103778 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103783 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103836 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-wtmp\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103865 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh6t5\" (UniqueName: \"kubernetes.io/projected/c306769a-094a-45dc-86f8-c3de6fc5d9e1-kube-api-access-gh6t5\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.103929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103892 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2642f\" (UniqueName: \"kubernetes.io/projected/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-api-access-2642f\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.104353 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.103976 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.104353 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.104242 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-wtmp\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.104353 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.104276 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-textfile\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.104353 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.104347 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c306769a-094a-45dc-86f8-c3de6fc5d9e1-metrics-client-ca\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.104558 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:33.103867 2559 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 08:53:33.104558 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:33.104474 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls podName:c306769a-094a-45dc-86f8-c3de6fc5d9e1 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:33.604456984 +0000 UTC m=+179.417062354 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls") pod "node-exporter-tgj7l" (UID: "c306769a-094a-45dc-86f8-c3de6fc5d9e1") : secret "node-exporter-tls" not found
Apr 23 08:53:33.104558 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.104424 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.104558 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.104535 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c306769a-094a-45dc-86f8-c3de6fc5d9e1-sys\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.105374 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.105350 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85da8388-1bc2-4cac-b714-3814193f1216-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.105939 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.105920 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.107086 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.107063 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.108366 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.108313 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.108454 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.108417 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.108515 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.108450 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.111912 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.111889 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25qxm\" (UniqueName: \"kubernetes.io/projected/85da8388-1bc2-4cac-b714-3814193f1216-kube-api-access-25qxm\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.111912 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.111895 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh6t5\" (UniqueName: \"kubernetes.io/projected/c306769a-094a-45dc-86f8-c3de6fc5d9e1-kube-api-access-gh6t5\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.112619 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.112601 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2642f\" (UniqueName: \"kubernetes.io/projected/b021b7b6-fa1c-439c-a9d4-1ca2e800d088-kube-api-access-2642f\") pod \"kube-state-metrics-69db897b98-tmsrv\" (UID: \"b021b7b6-fa1c-439c-a9d4-1ca2e800d088\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.187275 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.187242 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"
Apr 23 08:53:33.316027 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.316000 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tmsrv"]
Apr 23 08:53:33.318774 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:33.318742 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb021b7b6_fa1c_439c_a9d4_1ca2e800d088.slice/crio-a8788f4def3260874e2753c56340435f179fc863ad2bddca9bf89d8aa7dd23f6 WatchSource:0}: Error finding container a8788f4def3260874e2753c56340435f179fc863ad2bddca9bf89d8aa7dd23f6: Status 404 returned error can't find the container with id a8788f4def3260874e2753c56340435f179fc863ad2bddca9bf89d8aa7dd23f6
Apr 23 08:53:33.606159 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.606132 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.606308 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.606178 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.608321 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.608292 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c306769a-094a-45dc-86f8-c3de6fc5d9e1-node-exporter-tls\") pod \"node-exporter-tgj7l\" (UID: \"c306769a-094a-45dc-86f8-c3de6fc5d9e1\") " pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.608435 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.608344 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/85da8388-1bc2-4cac-b714-3814193f1216-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-twl26\" (UID: \"85da8388-1bc2-4cac-b714-3814193f1216\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.776781 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.776750 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"
Apr 23 08:53:33.798390 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.798359 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tgj7l"
Apr 23 08:53:33.816920 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:33.816881 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc306769a_094a_45dc_86f8_c3de6fc5d9e1.slice/crio-dbfc339874fd8c33f4865517f511a73ee836abe7b1351419d7158dfd24da6133 WatchSource:0}: Error finding container dbfc339874fd8c33f4865517f511a73ee836abe7b1351419d7158dfd24da6133: Status 404 returned error can't find the container with id dbfc339874fd8c33f4865517f511a73ee836abe7b1351419d7158dfd24da6133
Apr 23 08:53:33.912373 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:33.912340 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-twl26"]
Apr 23 08:53:33.916051 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:33.916023 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85da8388_1bc2_4cac_b714_3814193f1216.slice/crio-6f2bd5252312d6c8041b6df7eec047ea939d4eb5d621eda3029abc1884d0e845 WatchSource:0}: Error finding container 6f2bd5252312d6c8041b6df7eec047ea939d4eb5d621eda3029abc1884d0e845: Status 404 returned error can't find the container with id 6f2bd5252312d6c8041b6df7eec047ea939d4eb5d621eda3029abc1884d0e845
Apr 23 08:53:34.314656 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.314615 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" event={"ID":"85da8388-1bc2-4cac-b714-3814193f1216","Type":"ContainerStarted","Data":"b5fd3d30d4a5e0dd9ee6726ce7ff48504f337013d03ed26b82f925fa76958c23"}
Apr 23 08:53:34.314656 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.314654 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" event={"ID":"85da8388-1bc2-4cac-b714-3814193f1216","Type":"ContainerStarted","Data":"c6e315292f3097431776dd20f588b30004fecfb025d97b76f1c78098a7316f26"}
Apr 23 08:53:34.314928 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.314671 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" event={"ID":"85da8388-1bc2-4cac-b714-3814193f1216","Type":"ContainerStarted","Data":"6f2bd5252312d6c8041b6df7eec047ea939d4eb5d621eda3029abc1884d0e845"}
Apr 23 08:53:34.315919 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.315890 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" event={"ID":"b021b7b6-fa1c-439c-a9d4-1ca2e800d088","Type":"ContainerStarted","Data":"a8788f4def3260874e2753c56340435f179fc863ad2bddca9bf89d8aa7dd23f6"}
Apr 23 08:53:34.317051 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.317022 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgj7l" event={"ID":"c306769a-094a-45dc-86f8-c3de6fc5d9e1","Type":"ContainerStarted","Data":"dbfc339874fd8c33f4865517f511a73ee836abe7b1351419d7158dfd24da6133"}
Apr 23 08:53:34.826064 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.826032 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"]
Apr 23 08:53:34.829269 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.829251 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.831636 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831611 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bgkh39ia4s2ur\""
Apr 23 08:53:34.831768 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831639 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 23 08:53:34.831768 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831749 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 23 08:53:34.831768 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831761 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 23 08:53:34.831914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831763 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 23 08:53:34.831914 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.831903 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 23 08:53:34.832128 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.832097 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-knc92\""
Apr 23 08:53:34.838644 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.838621 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"]
Apr 23 08:53:34.914944 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.914917 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-grpc-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.914951 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915007 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915110 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915144 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915247 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4r48\" (UniqueName: \"kubernetes.io/projected/47eda6be-5d7e-4d5c-a453-64682ed1caec-kube-api-access-d4r48\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915305 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47eda6be-5d7e-4d5c-a453-64682ed1caec-metrics-client-ca\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:34.915338 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:34.915337 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016144 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016121 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4r48\" (UniqueName: \"kubernetes.io/projected/47eda6be-5d7e-4d5c-a453-64682ed1caec-kube-api-access-d4r48\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016156 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47eda6be-5d7e-4d5c-a453-64682ed1caec-metrics-client-ca\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016175 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016202 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-grpc-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016229 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016263 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016262 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016321 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.016547 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.016370 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.017975 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.017915 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47eda6be-5d7e-4d5c-a453-64682ed1caec-metrics-client-ca\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019306 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019257 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019415 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019331 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-grpc-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019415 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019335 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019665 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019569 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019780 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019754 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.019939 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.019918 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/47eda6be-5d7e-4d5c-a453-64682ed1caec-secret-thanos-querier-tls\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.022856 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.022837 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4r48\" (UniqueName: \"kubernetes.io/projected/47eda6be-5d7e-4d5c-a453-64682ed1caec-kube-api-access-d4r48\") pod \"thanos-querier-56bdc7c8dd-qg52t\" (UID: \"47eda6be-5d7e-4d5c-a453-64682ed1caec\") " pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"
Apr 23 08:53:35.139103 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.139077 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" Apr 23 08:53:35.300299 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.300268 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t"] Apr 23 08:53:35.319854 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:35.319821 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eda6be_5d7e_4d5c_a453_64682ed1caec.slice/crio-6298f980ec8baf04f3cc18de9c6c080e525a6590915a4b80e1a30c031f172564 WatchSource:0}: Error finding container 6298f980ec8baf04f3cc18de9c6c080e525a6590915a4b80e1a30c031f172564: Status 404 returned error can't find the container with id 6298f980ec8baf04f3cc18de9c6c080e525a6590915a4b80e1a30c031f172564 Apr 23 08:53:35.322209 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.322175 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" event={"ID":"b021b7b6-fa1c-439c-a9d4-1ca2e800d088","Type":"ContainerStarted","Data":"f76cde3665580fb5bbd414d85205146e08bcf5abd6bc25349694a709246c7183"} Apr 23 08:53:35.322320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.322215 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" event={"ID":"b021b7b6-fa1c-439c-a9d4-1ca2e800d088","Type":"ContainerStarted","Data":"8859f6e5e54fedee4a67567d5edadf750471099a5050a3e15f3f46a97e0dceb7"} Apr 23 08:53:35.322320 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.322230 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" event={"ID":"b021b7b6-fa1c-439c-a9d4-1ca2e800d088","Type":"ContainerStarted","Data":"4bfe34f828efa01432141a058058b0a93567d81d5614f5b70ce38fef352629f0"} Apr 23 08:53:35.323628 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.323576 2559 generic.go:358] 
"Generic (PLEG): container finished" podID="c306769a-094a-45dc-86f8-c3de6fc5d9e1" containerID="813476c3d507d4f6e9d1dd5fa8eee5ad6cc9d00b3d591e3233b921d61a4b2a3d" exitCode=0 Apr 23 08:53:35.323628 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.323621 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgj7l" event={"ID":"c306769a-094a-45dc-86f8-c3de6fc5d9e1","Type":"ContainerDied","Data":"813476c3d507d4f6e9d1dd5fa8eee5ad6cc9d00b3d591e3233b921d61a4b2a3d"} Apr 23 08:53:35.339764 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.339702 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tmsrv" podStartSLOduration=1.649041975 podStartE2EDuration="3.339691299s" podCreationTimestamp="2026-04-23 08:53:32 +0000 UTC" firstStartedPulling="2026-04-23 08:53:33.320603613 +0000 UTC m=+179.133208979" lastFinishedPulling="2026-04-23 08:53:35.011252934 +0000 UTC m=+180.823858303" observedRunningTime="2026-04-23 08:53:35.338415265 +0000 UTC m=+181.151020668" watchObservedRunningTime="2026-04-23 08:53:35.339691299 +0000 UTC m=+181.152296684" Apr 23 08:53:35.498511 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:35.498483 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:36.328224 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:36.328177 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"6298f980ec8baf04f3cc18de9c6c080e525a6590915a4b80e1a30c031f172564"} Apr 23 08:53:36.330639 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:36.330614 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgj7l" 
event={"ID":"c306769a-094a-45dc-86f8-c3de6fc5d9e1","Type":"ContainerStarted","Data":"a9a5ad6220cc6f628070fc2f93bfe3f07f96664d79fcc7bb089dfa9edce9fbab"} Apr 23 08:53:36.330745 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:36.330665 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgj7l" event={"ID":"c306769a-094a-45dc-86f8-c3de6fc5d9e1","Type":"ContainerStarted","Data":"2433ad00ba2cfbc8644748d7098e920e4c3e4cfdc72ec990bec7618e6f9382fb"} Apr 23 08:53:36.348328 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:36.348272 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tgj7l" podStartSLOduration=3.154239013 podStartE2EDuration="4.348257633s" podCreationTimestamp="2026-04-23 08:53:32 +0000 UTC" firstStartedPulling="2026-04-23 08:53:33.818784163 +0000 UTC m=+179.631389537" lastFinishedPulling="2026-04-23 08:53:35.012802791 +0000 UTC m=+180.825408157" observedRunningTime="2026-04-23 08:53:36.347287266 +0000 UTC m=+182.159892681" watchObservedRunningTime="2026-04-23 08:53:36.348257633 +0000 UTC m=+182.160863079" Apr 23 08:53:37.334615 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.334590 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" event={"ID":"85da8388-1bc2-4cac-b714-3814193f1216","Type":"ContainerStarted","Data":"13dde83ad056445ed52cda5de30c4ecd6d8497aed7309c8cc15fa172d30b6912"} Apr 23 08:53:37.351579 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.351539 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-twl26" podStartSLOduration=3.060871205 podStartE2EDuration="5.351524696s" podCreationTimestamp="2026-04-23 08:53:32 +0000 UTC" firstStartedPulling="2026-04-23 08:53:34.02582136 +0000 UTC m=+179.838426725" lastFinishedPulling="2026-04-23 08:53:36.316474851 +0000 UTC m=+182.129080216" 
observedRunningTime="2026-04-23 08:53:37.350443583 +0000 UTC m=+183.163048981" watchObservedRunningTime="2026-04-23 08:53:37.351524696 +0000 UTC m=+183.164130084" Apr 23 08:53:37.619122 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.616856 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x"] Apr 23 08:53:37.622399 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.622374 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:37.624166 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.624143 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 08:53:37.624277 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.624151 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-z9g7h\"" Apr 23 08:53:37.625174 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.625143 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x"] Apr 23 08:53:37.641722 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.640135 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w2l5x\" (UID: \"2921030c-c941-46f7-b825-ed90bf427d87\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:37.741449 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:37.741427 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w2l5x\" 
(UID: \"2921030c-c941-46f7-b825-ed90bf427d87\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:37.741575 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:37.741558 2559 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 08:53:37.741629 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:37.741617 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert podName:2921030c-c941-46f7-b825-ed90bf427d87 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:38.241602239 +0000 UTC m=+184.054207604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-w2l5x" (UID: "2921030c-c941-46f7-b825-ed90bf427d87") : secret "monitoring-plugin-cert" not found Apr 23 08:53:38.245828 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.245789 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w2l5x\" (UID: \"2921030c-c941-46f7-b825-ed90bf427d87\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:38.253857 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.253833 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2921030c-c941-46f7-b825-ed90bf427d87-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w2l5x\" (UID: \"2921030c-c941-46f7-b825-ed90bf427d87\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:38.339944 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.339907 2559 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"6530889ca7c38e41e28f6676418f694b1d83aa2091a180307e3b5a5145969f37"} Apr 23 08:53:38.340295 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.339948 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"16954db2fd4f42ea62974bd220ef7640a54145584a783dd73e9f73298061e85a"} Apr 23 08:53:38.340295 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.339964 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"eb037b37921194b0d734b63593d49ce7e8065b70170056595b345935846bcf17"} Apr 23 08:53:38.533459 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.533388 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:38.655344 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:38.655227 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x"] Apr 23 08:53:38.657902 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:38.657879 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2921030c_c941_46f7_b825_ed90bf427d87.slice/crio-e027123669f858e2e529d19ce40788f16b7085ad50ada99f10ccc83255c1b045 WatchSource:0}: Error finding container e027123669f858e2e529d19ce40788f16b7085ad50ada99f10ccc83255c1b045: Status 404 returned error can't find the container with id e027123669f858e2e529d19ce40788f16b7085ad50ada99f10ccc83255c1b045 Apr 23 08:53:39.034856 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.032605 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:53:39.037132 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.037108 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.039627 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.039954 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.040199 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.040564 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.040744 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 08:53:39.041046 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.040896 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 08:53:39.045021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.041546 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pcflh\"" Apr 23 08:53:39.045021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.041762 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 08:53:39.045021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.042275 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-tg1oe96okg3c\"" Apr 23 08:53:39.045021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.042503 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 08:53:39.045021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.042672 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 08:53:39.046089 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.045777 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 08:53:39.046399 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.046379 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 08:53:39.046581 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.046555 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 08:53:39.047699 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.047677 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052404 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp4dk\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052447 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052499 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052539 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052574 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052601 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052628 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052644 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052656 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052679 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052725 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.052776 ip-10-0-137-31 kubenswrapper[2559]: 
I0423 08:53:39.052775 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052853 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052881 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052904 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052930 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052955 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.052980 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.053369 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.053059 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154243 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154212 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154257 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154278 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154295 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154310 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154326 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154354 2559 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154657 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154378 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154657 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154400 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.154782 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.154759 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155214 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155312 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155348 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155374 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155433 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155467 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4dk\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155552 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155499 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155565 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155614 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.155655 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.155976 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:39.155798 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle podName:f6558060-afa9-4b47-b7f8-f9b1c572e562 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:53:39.655778517 +0000 UTC m=+185.468383884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562") : configmap references non-existent config key: ca-bundle.crt Apr 23 08:53:39.157440 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.157406 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.157544 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.157410 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.157544 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.157450 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.157544 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.157461 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
08:53:39.157692 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.157637 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.158492 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.158315 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.158492 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.158377 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.158492 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.158449 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.158697 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.158672 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.158882 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:53:39.158862 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.159704 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.159677 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.159801 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.159786 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.160034 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.160016 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.160476 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.160459 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.166575 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.166555 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp4dk\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.343726 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.343644 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" event={"ID":"2921030c-c941-46f7-b825-ed90bf427d87","Type":"ContainerStarted","Data":"e027123669f858e2e529d19ce40788f16b7085ad50ada99f10ccc83255c1b045"} Apr 23 08:53:39.346436 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.346410 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"ff2bd2744ad897c64644d825b1b90a6e1b2e40f118405a06e16cd71483d409a2"} Apr 23 08:53:39.346556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.346443 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"248050f21b613cb0a330e11195e822bb7973ee0292a95a65e47a25695959e61b"} Apr 23 08:53:39.346556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.346456 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" event={"ID":"47eda6be-5d7e-4d5c-a453-64682ed1caec","Type":"ContainerStarted","Data":"d75e7a41e30c3431e280440304fc55afb2b8d5bdfb6e28b307418d41da09306e"} Apr 23 08:53:39.346648 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.346620 2559 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" Apr 23 08:53:39.372385 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.372341 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" podStartSLOduration=2.152592305 podStartE2EDuration="5.372325526s" podCreationTimestamp="2026-04-23 08:53:34 +0000 UTC" firstStartedPulling="2026-04-23 08:53:35.321772909 +0000 UTC m=+181.134378278" lastFinishedPulling="2026-04-23 08:53:38.541506123 +0000 UTC m=+184.354111499" observedRunningTime="2026-04-23 08:53:39.371197165 +0000 UTC m=+185.183802587" watchObservedRunningTime="2026-04-23 08:53:39.372325526 +0000 UTC m=+185.184930940" Apr 23 08:53:39.661073 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.660975 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.662619 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.662599 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:39.961922 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:39.961836 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:40.241012 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:40.240969 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:53:40.244068 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:53:40.244035 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6558060_afa9_4b47_b7f8_f9b1c572e562.slice/crio-6ac7657dc5de6472b6ecd6c50cae9d0edbaae01e8d93a27e8256ad1e3cee9e4a WatchSource:0}: Error finding container 6ac7657dc5de6472b6ecd6c50cae9d0edbaae01e8d93a27e8256ad1e3cee9e4a: Status 404 returned error can't find the container with id 6ac7657dc5de6472b6ecd6c50cae9d0edbaae01e8d93a27e8256ad1e3cee9e4a Apr 23 08:53:40.350345 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:40.350308 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"6ac7657dc5de6472b6ecd6c50cae9d0edbaae01e8d93a27e8256ad1e3cee9e4a"} Apr 23 08:53:40.351646 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:40.351623 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" event={"ID":"2921030c-c941-46f7-b825-ed90bf427d87","Type":"ContainerStarted","Data":"c7710c76d22837b40807485062d3275f0280bce91c3de8afabea4533dfbaa8f0"} Apr 23 08:53:40.351946 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:40.351926 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:40.356634 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:40.356618 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" Apr 23 08:53:40.365314 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:53:40.365278 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w2l5x" podStartSLOduration=1.859576407 podStartE2EDuration="3.365266644s" podCreationTimestamp="2026-04-23 08:53:37 +0000 UTC" firstStartedPulling="2026-04-23 08:53:38.659772416 +0000 UTC m=+184.472377796" lastFinishedPulling="2026-04-23 08:53:40.16546265 +0000 UTC m=+185.978068033" observedRunningTime="2026-04-23 08:53:40.364364532 +0000 UTC m=+186.176969946" watchObservedRunningTime="2026-04-23 08:53:40.365266644 +0000 UTC m=+186.177872033" Apr 23 08:53:43.362600 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:43.362574 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="80b53333ed79d903f4955bdfcbe5e142ba7449b16bf35e4a6b58f7928636618b" exitCode=0 Apr 23 08:53:43.362967 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:43.362657 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"80b53333ed79d903f4955bdfcbe5e142ba7449b16bf35e4a6b58f7928636618b"} Apr 23 08:53:45.358270 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:45.358242 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-56bdc7c8dd-qg52t" Apr 23 08:53:47.377683 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377651 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"0a9b11791ae5739618f6b35a7c4bfd1502252f7268ad53a9ba8af2498caff0f4"} Apr 23 08:53:47.378025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377690 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"fd052262447edfb3a9d45fb2dde8fb7fa0fa53e3fe22e0a0aca79c9f5af5eb5e"} Apr 23 08:53:47.378025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377701 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"a6f965776e93015e8a0bc77270d73301af71015cd1d58057b92ba304099192ee"} Apr 23 08:53:47.378025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377709 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"399f9b4a32c6ed6c4e28ff19f0418db06a0b95b007d0bd7a688c4718acd1b39f"} Apr 23 08:53:47.378025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377717 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"c319dd1f0b68152dbcb92c4c1ec4eb24007da45db8e71496bc697c94e750d66a"} Apr 23 08:53:47.378025 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.377725 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerStarted","Data":"4e935d03b93589920eaa2bb8acc90ff2ca4e32b6a692a72a1eede6a6a45895af"} Apr 23 08:53:47.401992 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:47.401947 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.025777139 podStartE2EDuration="8.401929387s" podCreationTimestamp="2026-04-23 08:53:39 +0000 UTC" firstStartedPulling="2026-04-23 08:53:40.246414196 +0000 UTC m=+186.059019565" lastFinishedPulling="2026-04-23 08:53:46.622566432 +0000 UTC m=+192.435171813" observedRunningTime="2026-04-23 
08:53:47.400668514 +0000 UTC m=+193.213273901" watchObservedRunningTime="2026-04-23 08:53:47.401929387 +0000 UTC m=+193.214534778" Apr 23 08:53:49.962060 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:49.962025 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:53:50.514010 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.513930 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" podUID="4ddd3684-ea96-4a10-88e8-669c7b940363" containerName="registry" containerID="cri-o://c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd" gracePeriod=30 Apr 23 08:53:50.748619 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.748597 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" Apr 23 08:53:50.860226 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860166 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860226 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860210 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860380 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860242 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token\") pod 
\"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860380 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860289 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860380 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860318 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860522 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860385 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgzt6\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860522 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860411 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: \"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860522 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860439 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration\") pod \"4ddd3684-ea96-4a10-88e8-669c7b940363\" (UID: 
\"4ddd3684-ea96-4a10-88e8-669c7b940363\") " Apr 23 08:53:50.860945 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.860895 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:50.861079 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.861024 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:50.862824 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.862798 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:53:50.862824 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.862807 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:53:50.863072 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.863054 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6" (OuterVolumeSpecName: "kube-api-access-cgzt6") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "kube-api-access-cgzt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:53:50.863173 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.863156 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:53:50.863286 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.863272 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:53:50.869235 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.869213 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4ddd3684-ea96-4a10-88e8-669c7b940363" (UID: "4ddd3684-ea96-4a10-88e8-669c7b940363"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:53:50.961313 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961290 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cgzt6\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-kube-api-access-cgzt6\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961315 2559 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-installation-pull-secrets\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961330 2559 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4ddd3684-ea96-4a10-88e8-669c7b940363-image-registry-private-configuration\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961344 2559 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ddd3684-ea96-4a10-88e8-669c7b940363-ca-trust-extracted\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961358 2559 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-certificates\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961372 2559 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-bound-sa-token\") on node \"ip-10-0-137-31.ec2.internal\" 
DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961385 2559 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ddd3684-ea96-4a10-88e8-669c7b940363-trusted-ca\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:50.961402 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:50.961397 2559 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ddd3684-ea96-4a10-88e8-669c7b940363-registry-tls\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 08:53:51.391669 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.391642 2559 generic.go:358] "Generic (PLEG): container finished" podID="4ddd3684-ea96-4a10-88e8-669c7b940363" containerID="c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd" exitCode=0 Apr 23 08:53:51.392165 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.391674 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" event={"ID":"4ddd3684-ea96-4a10-88e8-669c7b940363","Type":"ContainerDied","Data":"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"} Apr 23 08:53:51.392165 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.391695 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77" event={"ID":"4ddd3684-ea96-4a10-88e8-669c7b940363","Type":"ContainerDied","Data":"d347715d25545852bdc36504c02c981a3d5e151d091e04bc3ce42595ff9dbf55"} Apr 23 08:53:51.392165 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.391703 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bcb9d88d6-qvq77"
Apr 23 08:53:51.392165 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.391711 2559 scope.go:117] "RemoveContainer" containerID="c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"
Apr 23 08:53:51.400829 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.400808 2559 scope.go:117] "RemoveContainer" containerID="c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"
Apr 23 08:53:51.401061 ip-10-0-137-31 kubenswrapper[2559]: E0423 08:53:51.401038 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd\": container with ID starting with c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd not found: ID does not exist" containerID="c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"
Apr 23 08:53:51.401125 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.401068 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd"} err="failed to get container status \"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd\": rpc error: code = NotFound desc = could not find container \"c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd\": container with ID starting with c5f247c71c6cf24ad188a43c2248e17c88c525ba6395c6855577ab312feab2dd not found: ID does not exist"
Apr 23 08:53:51.410637 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.410616 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"]
Apr 23 08:53:51.413916 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:51.413898 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bcb9d88d6-qvq77"]
Apr 23 08:53:52.815517 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:53:52.815478 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ddd3684-ea96-4a10-88e8-669c7b940363" path="/var/lib/kubelet/pods/4ddd3684-ea96-4a10-88e8-669c7b940363/volumes"
Apr 23 08:54:08.441704 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:08.441671 2559 generic.go:358] "Generic (PLEG): container finished" podID="158e7a7b-4a03-4678-8b2c-0dc7d0b7913c" containerID="5f7c49e4248473aa94bfe4a8621045240d0d3a44397bffc1501d04ffbe668a6e" exitCode=0
Apr 23 08:54:08.442170 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:08.441747 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" event={"ID":"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c","Type":"ContainerDied","Data":"5f7c49e4248473aa94bfe4a8621045240d0d3a44397bffc1501d04ffbe668a6e"}
Apr 23 08:54:08.442170 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:08.442085 2559 scope.go:117] "RemoveContainer" containerID="5f7c49e4248473aa94bfe4a8621045240d0d3a44397bffc1501d04ffbe668a6e"
Apr 23 08:54:09.445913 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:09.445877 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2hzz6" event={"ID":"158e7a7b-4a03-4678-8b2c-0dc7d0b7913c","Type":"ContainerStarted","Data":"a5f95cc562dfc6e23aae86d4d740969e2196804c7d041c94af1e5b8712c0c051"}
Apr 23 08:54:32.516595 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:32.516564 2559 generic.go:358] "Generic (PLEG): container finished" podID="93831e97-cd75-4159-a7be-102ac3929f81" containerID="897ac52b1c5a2cb4f1322a451a63edcb38a04b3b7311a5092bf9cb3ecb62c16f" exitCode=0
Apr 23 08:54:32.517091 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:32.516640 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" event={"ID":"93831e97-cd75-4159-a7be-102ac3929f81","Type":"ContainerDied","Data":"897ac52b1c5a2cb4f1322a451a63edcb38a04b3b7311a5092bf9cb3ecb62c16f"}
Apr 23 08:54:32.517091 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:32.517026 2559 scope.go:117] "RemoveContainer" containerID="897ac52b1c5a2cb4f1322a451a63edcb38a04b3b7311a5092bf9cb3ecb62c16f"
Apr 23 08:54:33.520382 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:33.520350 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kczr" event={"ID":"93831e97-cd75-4159-a7be-102ac3929f81","Type":"ContainerStarted","Data":"f6b196313babbb35ea547abf99688150c4c0173a3995f4a50a9ed001c4ea4660"}
Apr 23 08:54:39.962620 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:39.962579 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:54:39.990019 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:39.989731 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:54:40.556745 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:40.556720 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:54:46.584889 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:46.584852 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:54:46.587071 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:46.587036 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61450b58-933b-4b5d-bf40-9e4408670e3e-metrics-certs\") pod \"network-metrics-daemon-lx4sg\" (UID: \"61450b58-933b-4b5d-bf40-9e4408670e3e\") " pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:54:46.813634 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:46.813612 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\""
Apr 23 08:54:46.822083 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:46.822057 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lx4sg"
Apr 23 08:54:46.939265 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:46.939238 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lx4sg"]
Apr 23 08:54:46.942195 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:54:46.942168 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61450b58_933b_4b5d_bf40_9e4408670e3e.slice/crio-5167e79d7719b500cf8abab2f8c4f192fdc720844579ed82c641e2a029c8787d WatchSource:0}: Error finding container 5167e79d7719b500cf8abab2f8c4f192fdc720844579ed82c641e2a029c8787d: Status 404 returned error can't find the container with id 5167e79d7719b500cf8abab2f8c4f192fdc720844579ed82c641e2a029c8787d
Apr 23 08:54:47.563339 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:47.563301 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lx4sg" event={"ID":"61450b58-933b-4b5d-bf40-9e4408670e3e","Type":"ContainerStarted","Data":"5167e79d7719b500cf8abab2f8c4f192fdc720844579ed82c641e2a029c8787d"}
Apr 23 08:54:48.567980 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:48.567944 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lx4sg" event={"ID":"61450b58-933b-4b5d-bf40-9e4408670e3e","Type":"ContainerStarted","Data":"0d4ba2412cadfe699901149851c260a2d9010e72e7fc03026d1d2278d50cc6e5"}
Apr 23 08:54:48.567980 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:48.567981 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lx4sg" event={"ID":"61450b58-933b-4b5d-bf40-9e4408670e3e","Type":"ContainerStarted","Data":"34f01ce3c899997189cd5c31222b6f63cb6a2c650242850937be4d810da78529"}
Apr 23 08:54:48.583278 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:48.583237 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lx4sg" podStartSLOduration=253.588965579 podStartE2EDuration="4m14.583224775s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:54:46.944483965 +0000 UTC m=+252.757089333" lastFinishedPulling="2026-04-23 08:54:47.938743154 +0000 UTC m=+253.751348529" observedRunningTime="2026-04-23 08:54:48.581962091 +0000 UTC m=+254.394567479" watchObservedRunningTime="2026-04-23 08:54:48.583224775 +0000 UTC m=+254.395830179"
Apr 23 08:54:57.417122 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417037 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:54:57.417648 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417615 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="prometheus" containerID="cri-o://4e935d03b93589920eaa2bb8acc90ff2ca4e32b6a692a72a1eede6a6a45895af" gracePeriod=600
Apr 23 08:54:57.417754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417629 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy" containerID="cri-o://fd052262447edfb3a9d45fb2dde8fb7fa0fa53e3fe22e0a0aca79c9f5af5eb5e" gracePeriod=600
Apr 23 08:54:57.417754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417673 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-thanos" containerID="cri-o://0a9b11791ae5739618f6b35a7c4bfd1502252f7268ad53a9ba8af2498caff0f4" gracePeriod=600
Apr 23 08:54:57.417754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417681 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="config-reloader" containerID="cri-o://c319dd1f0b68152dbcb92c4c1ec4eb24007da45db8e71496bc697c94e750d66a" gracePeriod=600
Apr 23 08:54:57.417754 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417746 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-web" containerID="cri-o://a6f965776e93015e8a0bc77270d73301af71015cd1d58057b92ba304099192ee" gracePeriod=600
Apr 23 08:54:57.417945 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.417799 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="thanos-sidecar" containerID="cri-o://399f9b4a32c6ed6c4e28ff19f0418db06a0b95b007d0bd7a688c4718acd1b39f" gracePeriod=600
Apr 23 08:54:57.599516 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599485 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="0a9b11791ae5739618f6b35a7c4bfd1502252f7268ad53a9ba8af2498caff0f4" exitCode=0
Apr 23 08:54:57.599516 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599515 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="fd052262447edfb3a9d45fb2dde8fb7fa0fa53e3fe22e0a0aca79c9f5af5eb5e" exitCode=0
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599525 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="399f9b4a32c6ed6c4e28ff19f0418db06a0b95b007d0bd7a688c4718acd1b39f" exitCode=0
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599534 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="c319dd1f0b68152dbcb92c4c1ec4eb24007da45db8e71496bc697c94e750d66a" exitCode=0
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599546 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="4e935d03b93589920eaa2bb8acc90ff2ca4e32b6a692a72a1eede6a6a45895af" exitCode=0
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599557 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"0a9b11791ae5739618f6b35a7c4bfd1502252f7268ad53a9ba8af2498caff0f4"}
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599600 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"fd052262447edfb3a9d45fb2dde8fb7fa0fa53e3fe22e0a0aca79c9f5af5eb5e"}
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599617 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"399f9b4a32c6ed6c4e28ff19f0418db06a0b95b007d0bd7a688c4718acd1b39f"}
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599631 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"c319dd1f0b68152dbcb92c4c1ec4eb24007da45db8e71496bc697c94e750d66a"}
Apr 23 08:54:57.599733 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:57.599644 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"4e935d03b93589920eaa2bb8acc90ff2ca4e32b6a692a72a1eede6a6a45895af"}
Apr 23 08:54:58.607107 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.607079 2559 generic.go:358] "Generic (PLEG): container finished" podID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerID="a6f965776e93015e8a0bc77270d73301af71015cd1d58057b92ba304099192ee" exitCode=0
Apr 23 08:54:58.607412 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.607156 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"a6f965776e93015e8a0bc77270d73301af71015cd1d58057b92ba304099192ee"}
Apr 23 08:54:58.651075 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.651054 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:54:58.782164 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782099 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782164 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782140 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp4dk\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782165 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782205 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782229 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782258 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782298 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782355 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782346 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782372 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782402 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782434 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782463 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782506 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782533 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782560 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782586 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.782654 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782626 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.783104 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.782668 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0\") pod \"f6558060-afa9-4b47-b7f8-f9b1c572e562\" (UID: \"f6558060-afa9-4b47-b7f8-f9b1c572e562\") "
Apr 23 08:54:58.783929 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.783734 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:58.784053 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.783966 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:58.784436 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.784286 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:54:58.784436 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.784361 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:58.785554 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.785444 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk" (OuterVolumeSpecName: "kube-api-access-kp4dk") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "kube-api-access-kp4dk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:54:58.785554 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.785513 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config" (OuterVolumeSpecName: "config") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.785709 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.785628 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:58.786000 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.785954 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.786000 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.785959 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:58.787043 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.786973 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.787392 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.787357 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.787495 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.787468 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.787571 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.787524 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:54:58.787633 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.787596 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.787686 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.787659 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out" (OuterVolumeSpecName: "config-out") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:54:58.788090 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.788065 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.788368 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.788348 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.796746 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.796726 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config" (OuterVolumeSpecName: "web-config") pod "f6558060-afa9-4b47-b7f8-f9b1c572e562" (UID: "f6558060-afa9-4b47-b7f8-f9b1c572e562"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:58.883975 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.883953 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kp4dk\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-kube-api-access-kp4dk\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.883975 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.883973 2559 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-db\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884009 2559 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-grpc-tls\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884019 2559 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-config\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884028 2559 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-metrics-client-certs\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884039 2559 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884049 2559 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-metrics-client-ca\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884058 2559 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6558060-afa9-4b47-b7f8-f9b1c572e562-config-out\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884067 2559 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-kube-rbac-proxy\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884074 2559 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6558060-afa9-4b47-b7f8-f9b1c572e562-tls-assets\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884083 2559 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884092 2559 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884102 2559 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884111 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884110 2559 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884120 2559 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884129 2559 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884137 2559 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6558060-afa9-4b47-b7f8-f9b1c572e562-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:58.884455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:58.884147 2559 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6558060-afa9-4b47-b7f8-f9b1c572e562-web-config\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 08:54:59.613155 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.613116 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6558060-afa9-4b47-b7f8-f9b1c572e562","Type":"ContainerDied","Data":"6ac7657dc5de6472b6ecd6c50cae9d0edbaae01e8d93a27e8256ad1e3cee9e4a"}
Apr 23 08:54:59.613455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.613175 2559 scope.go:117] "RemoveContainer" containerID="0a9b11791ae5739618f6b35a7c4bfd1502252f7268ad53a9ba8af2498caff0f4"
Apr 23 08:54:59.613455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.613237 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:54:59.620598 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.620568 2559 scope.go:117] "RemoveContainer" containerID="fd052262447edfb3a9d45fb2dde8fb7fa0fa53e3fe22e0a0aca79c9f5af5eb5e"
Apr 23 08:54:59.627158 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.627140 2559 scope.go:117] "RemoveContainer" containerID="a6f965776e93015e8a0bc77270d73301af71015cd1d58057b92ba304099192ee"
Apr 23 08:54:59.631666 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.631643 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:54:59.634809 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.634790 2559 scope.go:117] "RemoveContainer" containerID="399f9b4a32c6ed6c4e28ff19f0418db06a0b95b007d0bd7a688c4718acd1b39f"
Apr 23 08:54:59.635426 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.635403 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:54:59.640946 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.640927 2559 scope.go:117] "RemoveContainer" containerID="c319dd1f0b68152dbcb92c4c1ec4eb24007da45db8e71496bc697c94e750d66a"
Apr 23 08:54:59.647159 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.647145 2559 scope.go:117] "RemoveContainer" containerID="4e935d03b93589920eaa2bb8acc90ff2ca4e32b6a692a72a1eede6a6a45895af"
Apr 23 08:54:59.653964 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.653950 2559 scope.go:117] "RemoveContainer" containerID="80b53333ed79d903f4955bdfcbe5e142ba7449b16bf35e4a6b58f7928636618b"
Apr 23 08:54:59.660629 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660613 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:54:59.660909 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660896 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562"
containerName="kube-rbac-proxy-web" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660911 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-web" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660923 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ddd3684-ea96-4a10-88e8-669c7b940363" containerName="registry" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660929 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddd3684-ea96-4a10-88e8-669c7b940363" containerName="registry" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660939 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="config-reloader" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660945 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="config-reloader" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660954 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="thanos-sidecar" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660959 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="thanos-sidecar" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660968 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-thanos" Apr 23 08:54:59.660976 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660972 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" 
containerName="kube-rbac-proxy-thanos" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.660981 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661003 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661012 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="init-config-reloader" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661018 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="init-config-reloader" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661024 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="prometheus" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661029 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="prometheus" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661074 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="prometheus" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661083 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="thanos-sidecar" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661088 2559 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661095 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-web" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661101 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="config-reloader" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661106 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" containerName="kube-rbac-proxy-thanos" Apr 23 08:54:59.661296 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.661112 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ddd3684-ea96-4a10-88e8-669c7b940363" containerName="registry" Apr 23 08:54:59.664883 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.664864 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.666542 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.666523 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 08:54:59.666718 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.666687 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 08:54:59.666816 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.666781 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 08:54:59.667070 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667051 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 08:54:59.667159 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667130 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 08:54:59.667159 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667134 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 08:54:59.667159 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667143 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 08:54:59.667322 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667191 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 08:54:59.667541 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667526 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 08:54:59.667675 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667586 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pcflh\"" Apr 23 08:54:59.667785 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667771 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-tg1oe96okg3c\"" Apr 23 08:54:59.667859 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.667793 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 08:54:59.668635 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.668618 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 08:54:59.670514 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.670496 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 08:54:59.673172 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.673153 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 08:54:59.675767 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.675747 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:54:59.789478 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789423 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789478 
ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789458 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789589 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789482 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789589 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789564 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789657 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789606 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789657 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789626 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789657 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789643 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789680 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789701 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789717 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config-out\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789747 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789732 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789858 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789756 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789858 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789777 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789858 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789798 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.789858 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789819 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4knt\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-kube-api-access-t4knt\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 23 08:54:59.789858 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789843 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.790021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789862 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.790021 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.789897 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.890391 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.890368 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.890455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.890395 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.890455 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.890432 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.890679 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.890658 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.891244 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.891071 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.891712 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.891610 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892030 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.891978 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892142 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892096 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892226 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892209 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892294 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892265 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892452 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892434 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892720 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:54:59.892702 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-web-config\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.892798 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892737 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893011 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.892865 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893034 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893114 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893068 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config-out\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893209 ip-10-0-137-31 
kubenswrapper[2559]: I0423 08:54:59.893192 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893254 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893225 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893294 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893257 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893432 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893289 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.893490 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.893451 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4knt\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-kube-api-access-t4knt\") pod \"prometheus-k8s-0\" (UID: 
\"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.894873 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.894430 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.894873 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.894649 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.895511 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.895050 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.895822 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.895797 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.897104 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.897077 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.897556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.897193 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.897556 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.897488 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.898356 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.898291 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.898622 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.898570 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-config-out\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.899797 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.899772 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" 
(UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.901348 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.900162 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.901348 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.900834 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.901965 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.901945 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.902121 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.902100 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-web-config\") pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.902294 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.902274 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4knt\" (UniqueName: \"kubernetes.io/projected/0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a-kube-api-access-t4knt\") 
pod \"prometheus-k8s-0\" (UID: \"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:54:59.975229 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:54:59.975196 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:55:00.100450 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:00.100422 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:55:00.103443 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:55:00.103415 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b471e3e_ed5f_4084_b9b0_d9f7ef7c985a.slice/crio-2585d6cda1c26dbb5b4791149a3a8e151a773fa2f18eeafa36893ef708259d87 WatchSource:0}: Error finding container 2585d6cda1c26dbb5b4791149a3a8e151a773fa2f18eeafa36893ef708259d87: Status 404 returned error can't find the container with id 2585d6cda1c26dbb5b4791149a3a8e151a773fa2f18eeafa36893ef708259d87 Apr 23 08:55:00.617224 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:00.617190 2559 generic.go:358] "Generic (PLEG): container finished" podID="0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a" containerID="e8ce5b0e233a20c1099d03a5c38429b16abc1aa24ed8c5c0f9fc12a7baa68d8e" exitCode=0 Apr 23 08:55:00.617627 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:00.617269 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerDied","Data":"e8ce5b0e233a20c1099d03a5c38429b16abc1aa24ed8c5c0f9fc12a7baa68d8e"} Apr 23 08:55:00.617627 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:00.617299 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"2585d6cda1c26dbb5b4791149a3a8e151a773fa2f18eeafa36893ef708259d87"} Apr 23 
08:55:00.817789 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:00.816977 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6558060-afa9-4b47-b7f8-f9b1c572e562" path="/var/lib/kubelet/pods/f6558060-afa9-4b47-b7f8-f9b1c572e562/volumes" Apr 23 08:55:01.624553 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.624519 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"01a9bcf3bcb3ece1d38280b30c73fe59f1eb96c29440a3e063ed5f8f05d0e463"} Apr 23 08:55:01.624553 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.624555 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"6b0cfd97f408525fc11c2ae07f2250a7706bf698ccb5b39f3a29a915bbdc8a28"} Apr 23 08:55:01.624918 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.624564 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"8755b825ff1b597e8e0770f419774ed2c16628636e15ec6b287312f82cc72298"} Apr 23 08:55:01.624918 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.624573 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"a3f123c7478f7e826e5a7f06d1fb767ae57962937104b638b1cecfed21d4ef74"} Apr 23 08:55:01.624918 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.624581 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"b4e239f1e87afefaf831d8724d3ffd084983bc781f634e55e506c5a55eaef896"} Apr 23 08:55:01.624918 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:55:01.624589 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a","Type":"ContainerStarted","Data":"7fe08669efb5edc9380c62f22caf367765ace600f978b2d54ef8ffc0239a0af5"} Apr 23 08:55:01.648869 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:01.648822 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.648809121 podStartE2EDuration="2.648809121s" podCreationTimestamp="2026-04-23 08:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:55:01.648239136 +0000 UTC m=+267.460844524" watchObservedRunningTime="2026-04-23 08:55:01.648809121 +0000 UTC m=+267.461414487" Apr 23 08:55:04.976278 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:04.976247 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:55:34.703438 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:34.703415 2559 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:55:59.976080 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:59.976043 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:55:59.991849 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:55:59.991826 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:56:00.827869 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:00.827838 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:56:10.734971 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.734939 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v6jqm"] Apr 23 
08:56:10.738187 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.738169 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.740154 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.740136 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:56:10.740229 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.740187 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-qklmm\"" Apr 23 08:56:10.740660 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.740646 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:56:10.749372 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.749349 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v6jqm"] Apr 23 08:56:10.880946 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.880923 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.881053 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.880960 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjh9\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-kube-api-access-czjh9\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.982139 ip-10-0-137-31 kubenswrapper[2559]: I0423 
08:56:10.982116 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.982231 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.982157 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czjh9\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-kube-api-access-czjh9\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.989772 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.989716 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:10.989772 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:10.989749 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjh9\" (UniqueName: \"kubernetes.io/projected/5a8ab68d-7562-498e-bea7-b4c3979c0671-kube-api-access-czjh9\") pod \"cert-manager-webhook-587ccfb98-v6jqm\" (UID: \"5a8ab68d-7562-498e-bea7-b4c3979c0671\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:11.064773 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:11.064751 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:11.182094 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:56:11.182069 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8ab68d_7562_498e_bea7_b4c3979c0671.slice/crio-3d56ed66c472055cbc53ece6fd19f029feb49b1a1b48d84f6b7114b99c902151 WatchSource:0}: Error finding container 3d56ed66c472055cbc53ece6fd19f029feb49b1a1b48d84f6b7114b99c902151: Status 404 returned error can't find the container with id 3d56ed66c472055cbc53ece6fd19f029feb49b1a1b48d84f6b7114b99c902151 Apr 23 08:56:11.182094 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:11.182077 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v6jqm"] Apr 23 08:56:11.184340 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:11.184323 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:56:11.843565 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:11.843527 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" event={"ID":"5a8ab68d-7562-498e-bea7-b4c3979c0671","Type":"ContainerStarted","Data":"3d56ed66c472055cbc53ece6fd19f029feb49b1a1b48d84f6b7114b99c902151"} Apr 23 08:56:14.852415 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:14.852383 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" event={"ID":"5a8ab68d-7562-498e-bea7-b4c3979c0671","Type":"ContainerStarted","Data":"8d649d553973130a49e4a22965248c49583fc8889800d6d803b3fabd5922d321"} Apr 23 08:56:14.852773 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:14.852428 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:14.868582 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:14.868538 
2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" podStartSLOduration=1.9681257840000002 podStartE2EDuration="4.868527242s" podCreationTimestamp="2026-04-23 08:56:10 +0000 UTC" firstStartedPulling="2026-04-23 08:56:11.184478838 +0000 UTC m=+336.997084207" lastFinishedPulling="2026-04-23 08:56:14.084880296 +0000 UTC m=+339.897485665" observedRunningTime="2026-04-23 08:56:14.867046903 +0000 UTC m=+340.679652302" watchObservedRunningTime="2026-04-23 08:56:14.868527242 +0000 UTC m=+340.681132630" Apr 23 08:56:20.857628 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:20.857593 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-v6jqm" Apr 23 08:56:22.233195 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.233164 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-g7wsd"] Apr 23 08:56:22.236238 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.236219 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.237998 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.237969 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dkfnt\"" Apr 23 08:56:22.248004 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.245564 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-g7wsd"] Apr 23 08:56:22.263568 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.263540 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vnx\" (UniqueName: \"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-kube-api-access-z7vnx\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.263661 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.263589 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-bound-sa-token\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.364886 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.364850 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-bound-sa-token\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.365008 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.364908 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vnx\" (UniqueName: 
\"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-kube-api-access-z7vnx\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.372235 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.372214 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-bound-sa-token\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.372377 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.372359 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vnx\" (UniqueName: \"kubernetes.io/projected/9f8dccf2-f4fb-4e88-b45e-fd31d6a06843-kube-api-access-z7vnx\") pod \"cert-manager-79c8d999ff-g7wsd\" (UID: \"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843\") " pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.548154 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.548096 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-g7wsd" Apr 23 08:56:22.662821 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.662794 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-g7wsd"] Apr 23 08:56:22.665962 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:56:22.665935 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f8dccf2_f4fb_4e88_b45e_fd31d6a06843.slice/crio-33284585cf214945ddb215e6d893db00248a9475ee7053c6e313e8969e6ef605 WatchSource:0}: Error finding container 33284585cf214945ddb215e6d893db00248a9475ee7053c6e313e8969e6ef605: Status 404 returned error can't find the container with id 33284585cf214945ddb215e6d893db00248a9475ee7053c6e313e8969e6ef605 Apr 23 08:56:22.875788 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.875718 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-g7wsd" event={"ID":"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843","Type":"ContainerStarted","Data":"e6e3ea27e1a8721146e008724fc39d7ee8cd6f495f6aae6471558d52a3f26523"} Apr 23 08:56:22.875788 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.875750 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-g7wsd" event={"ID":"9f8dccf2-f4fb-4e88-b45e-fd31d6a06843","Type":"ContainerStarted","Data":"33284585cf214945ddb215e6d893db00248a9475ee7053c6e313e8969e6ef605"} Apr 23 08:56:22.887647 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:56:22.887603 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-g7wsd" podStartSLOduration=0.887588614 podStartE2EDuration="887.588614ms" podCreationTimestamp="2026-04-23 08:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:56:22.887226725 +0000 UTC m=+348.699832127" 
watchObservedRunningTime="2026-04-23 08:56:22.887588614 +0000 UTC m=+348.700194004" Apr 23 08:58:55.374364 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.374330 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn"] Apr 23 08:58:55.377421 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.377405 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 08:58:55.379286 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.379263 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\"" Apr 23 08:58:55.379411 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.379311 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\"" Apr 23 08:58:55.379411 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.379323 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\"" Apr 23 08:58:55.394764 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.394746 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn"] Apr 23 08:58:55.427244 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.427224 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnqs\" (UniqueName: \"kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs\") pod \"progression-enabled-node-0-0-98vtn\" (UID: \"815f4113-631c-4675-95a0-fe7ed8d24aab\") " pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 08:58:55.528424 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.528400 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-nhnqs\" (UniqueName: \"kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs\") pod \"progression-enabled-node-0-0-98vtn\" (UID: \"815f4113-631c-4675-95a0-fe7ed8d24aab\") " pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 08:58:55.535908 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.535884 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnqs\" (UniqueName: \"kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs\") pod \"progression-enabled-node-0-0-98vtn\" (UID: \"815f4113-631c-4675-95a0-fe7ed8d24aab\") " pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 08:58:55.688276 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.688220 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 08:58:55.801603 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:55.801575 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn"] Apr 23 08:58:55.804687 ip-10-0-137-31 kubenswrapper[2559]: W0423 08:58:55.804656 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815f4113_631c_4675_95a0_fe7ed8d24aab.slice/crio-f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40 WatchSource:0}: Error finding container f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40: Status 404 returned error can't find the container with id f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40 Apr 23 08:58:56.312683 ip-10-0-137-31 kubenswrapper[2559]: I0423 08:58:56.312648 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" 
event={"ID":"815f4113-631c-4675-95a0-fe7ed8d24aab","Type":"ContainerStarted","Data":"f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40"} Apr 23 09:00:42.649248 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:00:42.649205 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" event={"ID":"815f4113-631c-4675-95a0-fe7ed8d24aab","Type":"ContainerStarted","Data":"ff6ffdd94b1ea9711b5c76d305b0ed321a146b4b5eb6cc5d32d9ff507d565964"} Apr 23 09:00:42.649719 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:00:42.649340 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 09:00:42.668806 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:00:42.668748 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" podStartSLOduration=1.401396501 podStartE2EDuration="1m47.668732127s" podCreationTimestamp="2026-04-23 08:58:55 +0000 UTC" firstStartedPulling="2026-04-23 08:58:55.806639566 +0000 UTC m=+501.619244932" lastFinishedPulling="2026-04-23 09:00:42.073975192 +0000 UTC m=+607.886580558" observedRunningTime="2026-04-23 09:00:42.666168192 +0000 UTC m=+608.478773580" watchObservedRunningTime="2026-04-23 09:00:42.668732127 +0000 UTC m=+608.481337515" Apr 23 09:00:44.654316 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:00:44.654286 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 09:01:05.652158 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:05.652121 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" podUID="815f4113-631c-4675-95a0-fe7ed8d24aab" containerName="node" probeResult="failure" output="Get \"http://10.133.0.25:28080/metrics\": dial tcp 10.133.0.25:28080: 
connect: connection refused" Apr 23 09:01:05.719082 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:05.719052 2559 generic.go:358] "Generic (PLEG): container finished" podID="815f4113-631c-4675-95a0-fe7ed8d24aab" containerID="ff6ffdd94b1ea9711b5c76d305b0ed321a146b4b5eb6cc5d32d9ff507d565964" exitCode=0 Apr 23 09:01:05.719198 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:05.719129 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" event={"ID":"815f4113-631c-4675-95a0-fe7ed8d24aab","Type":"ContainerDied","Data":"ff6ffdd94b1ea9711b5c76d305b0ed321a146b4b5eb6cc5d32d9ff507d565964"} Apr 23 09:01:06.843909 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:06.843887 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 09:01:06.999467 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:06.999442 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnqs\" (UniqueName: \"kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs\") pod \"815f4113-631c-4675-95a0-fe7ed8d24aab\" (UID: \"815f4113-631c-4675-95a0-fe7ed8d24aab\") " Apr 23 09:01:07.001434 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:07.001403 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs" (OuterVolumeSpecName: "kube-api-access-nhnqs") pod "815f4113-631c-4675-95a0-fe7ed8d24aab" (UID: "815f4113-631c-4675-95a0-fe7ed8d24aab"). InnerVolumeSpecName "kube-api-access-nhnqs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:01:07.100493 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:07.100471 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhnqs\" (UniqueName: \"kubernetes.io/projected/815f4113-631c-4675-95a0-fe7ed8d24aab-kube-api-access-nhnqs\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 09:01:07.725869 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:07.725831 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" event={"ID":"815f4113-631c-4675-95a0-fe7ed8d24aab","Type":"ContainerDied","Data":"f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40"} Apr 23 09:01:07.725869 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:07.725861 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn" Apr 23 09:01:07.726078 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:07.725866 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3fd3aae7a8ca7f63b939130b49db82f7fcb9494d619ea5e33524cd79d774c40" Apr 23 09:01:09.805100 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.805068 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"] Apr 23 09:01:09.805496 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.805386 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815f4113-631c-4675-95a0-fe7ed8d24aab" containerName="node" Apr 23 09:01:09.805496 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.805397 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f4113-631c-4675-95a0-fe7ed8d24aab" containerName="node" Apr 23 09:01:09.805496 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.805456 2559 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="815f4113-631c-4675-95a0-fe7ed8d24aab" containerName="node"
Apr 23 09:01:09.830844 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.830819 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:09.830977 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.830818 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"]
Apr 23 09:01:09.832739 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.832716 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\""
Apr 23 09:01:09.833099 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.833081 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\""
Apr 23 09:01:09.833208 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.833141 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\""
Apr 23 09:01:09.918175 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:09.918147 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tkb5\" (UniqueName: \"kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5\") pod \"progression-disabled-node-0-0-lc4pt\" (UID: \"f73fb432-7fbc-4656-bc0d-93c4f795c889\") " pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:10.019141 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.019115 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tkb5\" (UniqueName: \"kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5\") pod \"progression-disabled-node-0-0-lc4pt\" (UID: \"f73fb432-7fbc-4656-bc0d-93c4f795c889\") " pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:10.026950 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.026931 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tkb5\" (UniqueName: \"kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5\") pod \"progression-disabled-node-0-0-lc4pt\" (UID: \"f73fb432-7fbc-4656-bc0d-93c4f795c889\") " pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:10.140785 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.140731 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:10.259793 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.259765 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"]
Apr 23 09:01:10.262019 ip-10-0-137-31 kubenswrapper[2559]: W0423 09:01:10.261973 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf73fb432_7fbc_4656_bc0d_93c4f795c889.slice/crio-27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d WatchSource:0}: Error finding container 27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d: Status 404 returned error can't find the container with id 27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d
Apr 23 09:01:10.736858 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.736823 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" event={"ID":"f73fb432-7fbc-4656-bc0d-93c4f795c889","Type":"ContainerStarted","Data":"14afaf0604758515dc229e651e37f70e749a009e8e5aa917139381ec6f6f18a7"}
Apr 23 09:01:10.736858 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.736859 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" event={"ID":"f73fb432-7fbc-4656-bc0d-93c4f795c889","Type":"ContainerStarted","Data":"27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d"}
Apr 23 09:01:10.737090 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.736953 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:10.752089 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:10.752045 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" podStartSLOduration=1.7520304260000001 podStartE2EDuration="1.752030426s" podCreationTimestamp="2026-04-23 09:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:01:10.751122059 +0000 UTC m=+636.563727460" watchObservedRunningTime="2026-04-23 09:01:10.752030426 +0000 UTC m=+636.564635815"
Apr 23 09:01:12.747699 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:12.747670 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:33.740198 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:33.740161 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" podUID="f73fb432-7fbc-4656-bc0d-93c4f795c889" containerName="node" probeResult="failure" output="Get \"http://10.133.0.26:28080/metrics\": dial tcp 10.133.0.26:28080: connect: connection refused"
Apr 23 09:01:33.821922 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:33.821897 2559 generic.go:358] "Generic (PLEG): container finished" podID="f73fb432-7fbc-4656-bc0d-93c4f795c889" containerID="14afaf0604758515dc229e651e37f70e749a009e8e5aa917139381ec6f6f18a7" exitCode=0
Apr 23 09:01:33.822068 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:33.821997 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" event={"ID":"f73fb432-7fbc-4656-bc0d-93c4f795c889","Type":"ContainerDied","Data":"14afaf0604758515dc229e651e37f70e749a009e8e5aa917139381ec6f6f18a7"}
Apr 23 09:01:34.942791 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:34.942766 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:35.091522 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.091455 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tkb5\" (UniqueName: \"kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5\") pod \"f73fb432-7fbc-4656-bc0d-93c4f795c889\" (UID: \"f73fb432-7fbc-4656-bc0d-93c4f795c889\") "
Apr 23 09:01:35.093280 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.093255 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5" (OuterVolumeSpecName: "kube-api-access-9tkb5") pod "f73fb432-7fbc-4656-bc0d-93c4f795c889" (UID: "f73fb432-7fbc-4656-bc0d-93c4f795c889"). InnerVolumeSpecName "kube-api-access-9tkb5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:01:35.192863 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.192840 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tkb5\" (UniqueName: \"kubernetes.io/projected/f73fb432-7fbc-4656-bc0d-93c4f795c889-kube-api-access-9tkb5\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 09:01:35.829830 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.829802 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt" event={"ID":"f73fb432-7fbc-4656-bc0d-93c4f795c889","Type":"ContainerDied","Data":"27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d"}
Apr 23 09:01:35.829830 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.829829 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"
Apr 23 09:01:35.830057 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:35.829833 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f518fd3b80bd6cb949a16b436ecd2454e461e9f34425231cf33bc073b6ba9d"
Apr 23 09:01:44.805918 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.805885 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"]
Apr 23 09:01:44.806324 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.806236 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f73fb432-7fbc-4656-bc0d-93c4f795c889" containerName="node"
Apr 23 09:01:44.806324 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.806249 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73fb432-7fbc-4656-bc0d-93c4f795c889" containerName="node"
Apr 23 09:01:44.806324 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.806301 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="f73fb432-7fbc-4656-bc0d-93c4f795c889" containerName="node"
Apr 23 09:01:44.809206 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.809191 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:44.811077 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.811056 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\""
Apr 23 09:01:44.811199 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.811055 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\""
Apr 23 09:01:44.811199 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.811148 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\""
Apr 23 09:01:44.826424 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.826402 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"]
Apr 23 09:01:44.862362 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.862340 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlxd\" (UniqueName: \"kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd\") pod \"progression-invalid-node-0-0-984m7\" (UID: \"55da8be8-1179-4a1f-8e44-283aaf5b91a5\") " pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:44.963078 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.963052 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btlxd\" (UniqueName: \"kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd\") pod \"progression-invalid-node-0-0-984m7\" (UID: \"55da8be8-1179-4a1f-8e44-283aaf5b91a5\") " pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:44.970917 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:44.970893 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btlxd\" (UniqueName: \"kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd\") pod \"progression-invalid-node-0-0-984m7\" (UID: \"55da8be8-1179-4a1f-8e44-283aaf5b91a5\") " pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:45.120416 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.120346 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:45.235294 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.235270 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"]
Apr 23 09:01:45.237095 ip-10-0-137-31 kubenswrapper[2559]: W0423 09:01:45.237066 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55da8be8_1179_4a1f_8e44_283aaf5b91a5.slice/crio-c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378 WatchSource:0}: Error finding container c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378: Status 404 returned error can't find the container with id c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378
Apr 23 09:01:45.239170 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.239150 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 09:01:45.865504 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.864785 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" event={"ID":"55da8be8-1179-4a1f-8e44-283aaf5b91a5","Type":"ContainerStarted","Data":"b8432cace783973f3de18d9d616fabdf3d689e8d4fa069ce1a58108f7befd1d9"}
Apr 23 09:01:45.865504 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.864826 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" event={"ID":"55da8be8-1179-4a1f-8e44-283aaf5b91a5","Type":"ContainerStarted","Data":"c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378"}
Apr 23 09:01:45.865504 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:45.865468 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:47.870247 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:47.870219 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:01:47.886620 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:01:47.886567 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" podStartSLOduration=3.886551296 podStartE2EDuration="3.886551296s" podCreationTimestamp="2026-04-23 09:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:01:45.882742107 +0000 UTC m=+671.695347497" watchObservedRunningTime="2026-04-23 09:01:47.886551296 +0000 UTC m=+673.699156685"
Apr 23 09:02:08.868180 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:08.868140 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" podUID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" containerName="node" probeResult="failure" output="Get \"http://10.133.0.27:28080/metrics\": dial tcp 10.133.0.27:28080: connect: connection refused"
Apr 23 09:02:08.937656 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:08.937618 2559 generic.go:358] "Generic (PLEG): container finished" podID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" containerID="b8432cace783973f3de18d9d616fabdf3d689e8d4fa069ce1a58108f7befd1d9" exitCode=0
Apr 23 09:02:08.937815 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:08.937696 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" event={"ID":"55da8be8-1179-4a1f-8e44-283aaf5b91a5","Type":"ContainerDied","Data":"b8432cace783973f3de18d9d616fabdf3d689e8d4fa069ce1a58108f7befd1d9"}
Apr 23 09:02:10.060551 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.060528 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:02:10.146763 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.146733 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btlxd\" (UniqueName: \"kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd\") pod \"55da8be8-1179-4a1f-8e44-283aaf5b91a5\" (UID: \"55da8be8-1179-4a1f-8e44-283aaf5b91a5\") "
Apr 23 09:02:10.148713 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.148690 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd" (OuterVolumeSpecName: "kube-api-access-btlxd") pod "55da8be8-1179-4a1f-8e44-283aaf5b91a5" (UID: "55da8be8-1179-4a1f-8e44-283aaf5b91a5"). InnerVolumeSpecName "kube-api-access-btlxd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:02:10.247882 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.247861 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-btlxd\" (UniqueName: \"kubernetes.io/projected/55da8be8-1179-4a1f-8e44-283aaf5b91a5-kube-api-access-btlxd\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 09:02:10.945720 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.945692 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7" event={"ID":"55da8be8-1179-4a1f-8e44-283aaf5b91a5","Type":"ContainerDied","Data":"c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378"}
Apr 23 09:02:10.945720 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.945723 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d275947291105764de886881378752df8a87f23c2d21f2570a94dd15468378"
Apr 23 09:02:10.945889 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:02:10.945726 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"
Apr 23 09:04:00.433343 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.433269 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"]
Apr 23 09:04:00.433713 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.433637 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" containerName="node"
Apr 23 09:04:00.433713 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.433650 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" containerName="node"
Apr 23 09:04:00.433713 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.433706 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" containerName="node"
Apr 23 09:04:00.436590 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.436571 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:00.438322 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.438303 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\""
Apr 23 09:04:00.438415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.438304 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\""
Apr 23 09:04:00.438690 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.438675 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\""
Apr 23 09:04:00.444503 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.444485 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"]
Apr 23 09:04:00.517110 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.517083 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dht4m\" (UniqueName: \"kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m\") pod \"progression-no-metrics-node-0-0-hldhg\" (UID: \"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a\") " pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:00.618162 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.618136 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dht4m\" (UniqueName: \"kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m\") pod \"progression-no-metrics-node-0-0-hldhg\" (UID: \"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a\") " pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:00.625753 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.625728 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dht4m\" (UniqueName: \"kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m\") pod \"progression-no-metrics-node-0-0-hldhg\" (UID: \"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a\") " pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:00.746896 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.746874 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:00.867541 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:00.867519 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"]
Apr 23 09:04:00.869797 ip-10-0-137-31 kubenswrapper[2559]: W0423 09:04:00.869771 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf4d26c_395d_4d66_9ec2_f9ddbfcd131a.slice/crio-eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81 WatchSource:0}: Error finding container eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81: Status 404 returned error can't find the container with id eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81
Apr 23 09:04:01.317725 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:01.317686 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg" event={"ID":"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a","Type":"ContainerStarted","Data":"5ce1b944ef99d18f9e8cdb7b49e895d20c3f01875f88316e7b00e40297ca740b"}
Apr 23 09:04:01.317725 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:01.317727 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg" event={"ID":"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a","Type":"ContainerStarted","Data":"eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81"}
Apr 23 09:04:01.333507 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:01.333449 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg" podStartSLOduration=1.3334325 podStartE2EDuration="1.3334325s" podCreationTimestamp="2026-04-23 09:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:04:01.332801438 +0000 UTC m=+807.145406825" watchObservedRunningTime="2026-04-23 09:04:01.3334325 +0000 UTC m=+807.146037893"
Apr 23 09:04:06.334476 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:06.334445 2559 generic.go:358] "Generic (PLEG): container finished" podID="7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" containerID="5ce1b944ef99d18f9e8cdb7b49e895d20c3f01875f88316e7b00e40297ca740b" exitCode=0
Apr 23 09:04:06.334819 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:06.334516 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg" event={"ID":"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a","Type":"ContainerDied","Data":"5ce1b944ef99d18f9e8cdb7b49e895d20c3f01875f88316e7b00e40297ca740b"}
Apr 23 09:04:07.458281 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:07.458261 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:07.566855 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:07.566832 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dht4m\" (UniqueName: \"kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m\") pod \"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a\" (UID: \"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a\") "
Apr 23 09:04:07.568715 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:07.568696 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m" (OuterVolumeSpecName: "kube-api-access-dht4m") pod "7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" (UID: "7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a"). InnerVolumeSpecName "kube-api-access-dht4m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:04:07.668045 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:07.667977 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dht4m\" (UniqueName: \"kubernetes.io/projected/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a-kube-api-access-dht4m\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\""
Apr 23 09:04:08.341693 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:08.341663 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"
Apr 23 09:04:08.341858 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:08.341664 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg" event={"ID":"7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a","Type":"ContainerDied","Data":"eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81"}
Apr 23 09:04:08.341858 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:08.341774 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff14258cd86fb66edebf6dd6dc977f31834ae5a299191d1c691b82dbae79c81"
Apr 23 09:04:12.784021 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.783973 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b45tk/must-gather-6t6kd"]
Apr 23 09:04:12.784375 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.784311 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" containerName="node"
Apr 23 09:04:12.784375 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.784323 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" containerName="node"
Apr 23 09:04:12.784449 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.784397 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" containerName="node"
Apr 23 09:04:12.787532 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.787518 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:12.789327 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.789302 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b45tk\"/\"default-dockercfg-5w67x\""
Apr 23 09:04:12.789427 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.789302 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b45tk\"/\"openshift-service-ca.crt\""
Apr 23 09:04:12.789427 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.789342 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b45tk\"/\"kube-root-ca.crt\""
Apr 23 09:04:12.794603 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.794474 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b45tk/must-gather-6t6kd"]
Apr 23 09:04:12.902043 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.902019 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:12.902138 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:12.902047 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9jd\" (UniqueName: \"kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.003097 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.003069 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.003097 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.003101 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9jd\" (UniqueName: \"kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.003380 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.003365 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.010213 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.010185 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9jd\" (UniqueName: \"kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd\") pod \"must-gather-6t6kd\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") " pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.097284 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.097225 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:04:13.213868 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.213832 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b45tk/must-gather-6t6kd"]
Apr 23 09:04:13.215231 ip-10-0-137-31 kubenswrapper[2559]: W0423 09:04:13.215201 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e936f59_3191_4fe3_a45d_381713c176cc.slice/crio-1616b95b2827856682903638882ce913d6de88f7715e6975d08a1b5859ddaa87 WatchSource:0}: Error finding container 1616b95b2827856682903638882ce913d6de88f7715e6975d08a1b5859ddaa87: Status 404 returned error can't find the container with id 1616b95b2827856682903638882ce913d6de88f7715e6975d08a1b5859ddaa87
Apr 23 09:04:13.360332 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:13.360258 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b45tk/must-gather-6t6kd" event={"ID":"3e936f59-3191-4fe3-a45d-381713c176cc","Type":"ContainerStarted","Data":"1616b95b2827856682903638882ce913d6de88f7715e6975d08a1b5859ddaa87"}
Apr 23 09:04:17.407268 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.407230 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"]
Apr 23 09:04:17.410583 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.410555 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-disabled-node-0-0-lc4pt"]
Apr 23 09:04:17.414702 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.414679 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn"]
Apr 23 09:04:17.419584 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.419562 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-enabled-node-0-0-98vtn"]
Apr 23 09:04:17.423765 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.423744 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"]
Apr 23 09:04:17.427108 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.427088 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-invalid-node-0-0-984m7"]
Apr 23 09:04:17.441998 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.441968 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"]
Apr 23 09:04:17.443690 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:17.443660 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-no-metrics-node-0-0-hldhg"]
Apr 23 09:04:18.816603 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:18.816566 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55da8be8-1179-4a1f-8e44-283aaf5b91a5" path="/var/lib/kubelet/pods/55da8be8-1179-4a1f-8e44-283aaf5b91a5/volumes"
Apr 23 09:04:18.817085 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:18.817067 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a" path="/var/lib/kubelet/pods/7cf4d26c-395d-4d66-9ec2-f9ddbfcd131a/volumes"
Apr 23 09:04:18.817485 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:18.817470 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815f4113-631c-4675-95a0-fe7ed8d24aab" path="/var/lib/kubelet/pods/815f4113-631c-4675-95a0-fe7ed8d24aab/volumes"
Apr 23 09:04:18.817877 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:18.817861 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f73fb432-7fbc-4656-bc0d-93c4f795c889" path="/var/lib/kubelet/pods/f73fb432-7fbc-4656-bc0d-93c4f795c889/volumes"
Apr 23 09:04:19.383996 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:19.383931 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b45tk/must-gather-6t6kd" event={"ID":"3e936f59-3191-4fe3-a45d-381713c176cc","Type":"ContainerStarted","Data":"06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050"}
Apr 23 09:04:19.384152 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:19.384007 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b45tk/must-gather-6t6kd" event={"ID":"3e936f59-3191-4fe3-a45d-381713c176cc","Type":"ContainerStarted","Data":"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d"}
Apr 23 09:04:19.398350 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:04:19.398305 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b45tk/must-gather-6t6kd" podStartSLOduration=2.008314165 podStartE2EDuration="7.398289896s" podCreationTimestamp="2026-04-23 09:04:12 +0000 UTC" firstStartedPulling="2026-04-23 09:04:13.216787612 +0000 UTC m=+819.029392981" lastFinishedPulling="2026-04-23 09:04:18.606763342 +0000 UTC m=+824.419368712" observedRunningTime="2026-04-23 09:04:19.396880488 +0000 UTC m=+825.209485876" watchObservedRunningTime="2026-04-23 09:04:19.398289896 +0000 UTC m=+825.210895285"
Apr 23 09:05:04.553487 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:04.553447 2559 generic.go:358] "Generic (PLEG): container finished" podID="3e936f59-3191-4fe3-a45d-381713c176cc" containerID="1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d" exitCode=0
Apr 23 09:05:04.553897 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:04.553512 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b45tk/must-gather-6t6kd" event={"ID":"3e936f59-3191-4fe3-a45d-381713c176cc","Type":"ContainerDied","Data":"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d"}
Apr 23 09:05:04.553897 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:04.553845 2559 scope.go:117] "RemoveContainer" containerID="1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d"
Apr 23 09:05:04.793610 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:04.793581 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b45tk_must-gather-6t6kd_3e936f59-3191-4fe3-a45d-381713c176cc/gather/0.log"
Apr 23 09:05:08.173939 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:08.173897 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8jbv7_4bd15f5a-3c09-4ed5-b70b-145f046716e5/global-pull-secret-syncer/0.log"
Apr 23 09:05:08.334002 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:08.333953 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2zjq6_d2f6fea9-05ba-40e3-bbdf-c915828b21ac/konnectivity-agent/0.log"
Apr 23 09:05:08.401543 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:08.401520 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-31.ec2.internal_c00815b7f1126dd8cdee9bcddd809207/haproxy/0.log"
Apr 23 09:05:10.138668 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.138632 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b45tk/must-gather-6t6kd"]
Apr 23 09:05:10.139076 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.138840 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-b45tk/must-gather-6t6kd" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="copy" containerID="cri-o://06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050" gracePeriod=2
Apr 23 09:05:10.144391 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.144349 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b45tk/must-gather-6t6kd"]
Apr 23 09:05:10.369403 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.369381 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b45tk_must-gather-6t6kd_3e936f59-3191-4fe3-a45d-381713c176cc/copy/0.log"
Apr 23 09:05:10.369744 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.369729 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b45tk/must-gather-6t6kd"
Apr 23 09:05:10.371127 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.371106 2559 status_manager.go:895] "Failed to get status for pod" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" pod="openshift-must-gather-b45tk/must-gather-6t6kd" err="pods \"must-gather-6t6kd\" is forbidden: User \"system:node:ip-10-0-137-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b45tk\": no relationship found between node 'ip-10-0-137-31.ec2.internal' and this object"
Apr 23 09:05:10.505647 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.505627 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9jd\" (UniqueName: \"kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd\") pod \"3e936f59-3191-4fe3-a45d-381713c176cc\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") "
Apr 23 09:05:10.505734 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.505695 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output\") pod \"3e936f59-3191-4fe3-a45d-381713c176cc\" (UID: \"3e936f59-3191-4fe3-a45d-381713c176cc\") "
Apr 23 09:05:10.507723 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.507688 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd" (OuterVolumeSpecName: "kube-api-access-vq9jd") pod "3e936f59-3191-4fe3-a45d-381713c176cc" (UID: "3e936f59-3191-4fe3-a45d-381713c176cc").
InnerVolumeSpecName "kube-api-access-vq9jd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:05:10.507974 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.507958 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3e936f59-3191-4fe3-a45d-381713c176cc" (UID: "3e936f59-3191-4fe3-a45d-381713c176cc"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:05:10.573938 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.573917 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b45tk_must-gather-6t6kd_3e936f59-3191-4fe3-a45d-381713c176cc/copy/0.log" Apr 23 09:05:10.574243 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.574220 2559 generic.go:358] "Generic (PLEG): container finished" podID="3e936f59-3191-4fe3-a45d-381713c176cc" containerID="06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050" exitCode=143 Apr 23 09:05:10.574298 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.574271 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b45tk/must-gather-6t6kd" Apr 23 09:05:10.574341 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.574318 2559 scope.go:117] "RemoveContainer" containerID="06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050" Apr 23 09:05:10.575821 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.575797 2559 status_manager.go:895] "Failed to get status for pod" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" pod="openshift-must-gather-b45tk/must-gather-6t6kd" err="pods \"must-gather-6t6kd\" is forbidden: User \"system:node:ip-10-0-137-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b45tk\": no relationship found between node 'ip-10-0-137-31.ec2.internal' and this object" Apr 23 09:05:10.582654 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.582488 2559 scope.go:117] "RemoveContainer" containerID="1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d" Apr 23 09:05:10.584404 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.584382 2559 status_manager.go:895] "Failed to get status for pod" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" pod="openshift-must-gather-b45tk/must-gather-6t6kd" err="pods \"must-gather-6t6kd\" is forbidden: User \"system:node:ip-10-0-137-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b45tk\": no relationship found between node 'ip-10-0-137-31.ec2.internal' and this object" Apr 23 09:05:10.593768 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.593754 2559 scope.go:117] "RemoveContainer" containerID="06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050" Apr 23 09:05:10.594019 ip-10-0-137-31 kubenswrapper[2559]: E0423 09:05:10.593977 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050\": container with ID starting 
with 06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050 not found: ID does not exist" containerID="06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050" Apr 23 09:05:10.594095 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.594032 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050"} err="failed to get container status \"06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050\": rpc error: code = NotFound desc = could not find container \"06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050\": container with ID starting with 06cbfed7510c73dc8d2453e5bbd22d49c845baaa3fae233e71f5cf95acd41050 not found: ID does not exist" Apr 23 09:05:10.594095 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.594060 2559 scope.go:117] "RemoveContainer" containerID="1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d" Apr 23 09:05:10.594282 ip-10-0-137-31 kubenswrapper[2559]: E0423 09:05:10.594264 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d\": container with ID starting with 1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d not found: ID does not exist" containerID="1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d" Apr 23 09:05:10.594323 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.594287 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d"} err="failed to get container status \"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d\": rpc error: code = NotFound desc = could not find container \"1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d\": container with ID starting with 
1b1c90ef2d9a157d7eea1ffb2cad9972b3e157feea59b8a73b4b77c74ea4dd8d not found: ID does not exist" Apr 23 09:05:10.606630 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.606613 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vq9jd\" (UniqueName: \"kubernetes.io/projected/3e936f59-3191-4fe3-a45d-381713c176cc-kube-api-access-vq9jd\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 09:05:10.606630 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.606630 2559 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e936f59-3191-4fe3-a45d-381713c176cc-must-gather-output\") on node \"ip-10-0-137-31.ec2.internal\" DevicePath \"\"" Apr 23 09:05:10.814866 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:10.814805 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" path="/var/lib/kubelet/pods/3e936f59-3191-4fe3-a45d-381713c176cc/volumes" Apr 23 09:05:11.680563 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.680493 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vjqf6_88c6d6b4-1465-4419-b9d5-b3d67fda3332/cluster-monitoring-operator/0.log" Apr 23 09:05:11.703064 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.703040 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tmsrv_b021b7b6-fa1c-439c-a9d4-1ca2e800d088/kube-state-metrics/0.log" Apr 23 09:05:11.716624 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.716602 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tmsrv_b021b7b6-fa1c-439c-a9d4-1ca2e800d088/kube-rbac-proxy-main/0.log" Apr 23 09:05:11.733155 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.733138 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tmsrv_b021b7b6-fa1c-439c-a9d4-1ca2e800d088/kube-rbac-proxy-self/0.log" Apr 23 09:05:11.781025 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.781003 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-w2l5x_2921030c-c941-46f7-b825-ed90bf427d87/monitoring-plugin/0.log" Apr 23 09:05:11.885856 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.885834 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgj7l_c306769a-094a-45dc-86f8-c3de6fc5d9e1/node-exporter/0.log" Apr 23 09:05:11.901575 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.901556 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgj7l_c306769a-094a-45dc-86f8-c3de6fc5d9e1/kube-rbac-proxy/0.log" Apr 23 09:05:11.919147 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:11.919131 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgj7l_c306769a-094a-45dc-86f8-c3de6fc5d9e1/init-textfile/0.log" Apr 23 09:05:12.014564 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.014546 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-twl26_85da8388-1bc2-4cac-b714-3814193f1216/kube-rbac-proxy-main/0.log" Apr 23 09:05:12.029901 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.029876 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-twl26_85da8388-1bc2-4cac-b714-3814193f1216/kube-rbac-proxy-self/0.log" Apr 23 09:05:12.046362 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.046343 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-twl26_85da8388-1bc2-4cac-b714-3814193f1216/openshift-state-metrics/0.log" Apr 23 09:05:12.073044 ip-10-0-137-31 
kubenswrapper[2559]: I0423 09:05:12.073024 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/prometheus/0.log" Apr 23 09:05:12.090812 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.090788 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/config-reloader/0.log" Apr 23 09:05:12.108155 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.108131 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/thanos-sidecar/0.log" Apr 23 09:05:12.124755 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.124739 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/kube-rbac-proxy-web/0.log" Apr 23 09:05:12.142828 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.142806 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/kube-rbac-proxy/0.log" Apr 23 09:05:12.162291 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.162276 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/kube-rbac-proxy-thanos/0.log" Apr 23 09:05:12.180849 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.180831 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0b471e3e-ed5f-4084-b9b0-d9f7ef7c985a/init-config-reloader/0.log" Apr 23 09:05:12.207719 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.207701 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-q8mrb_34002683-14aa-4497-a022-e41ce886599a/prometheus-operator/0.log" Apr 23 09:05:12.220050 ip-10-0-137-31 kubenswrapper[2559]: 
I0423 09:05:12.220030 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-q8mrb_34002683-14aa-4497-a022-e41ce886599a/kube-rbac-proxy/0.log" Apr 23 09:05:12.240904 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.240880 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2bc5z_5a69f4af-57b2-4d08-a860-f69a31bc13f5/prometheus-operator-admission-webhook/0.log" Apr 23 09:05:12.336872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.336793 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/thanos-query/0.log" Apr 23 09:05:12.353929 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.353902 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/kube-rbac-proxy-web/0.log" Apr 23 09:05:12.368263 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.368242 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/kube-rbac-proxy/0.log" Apr 23 09:05:12.384452 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.384437 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/prom-label-proxy/0.log" Apr 23 09:05:12.399415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.399396 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/kube-rbac-proxy-rules/0.log" Apr 23 09:05:12.414699 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:12.414667 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-56bdc7c8dd-qg52t_47eda6be-5d7e-4d5c-a453-64682ed1caec/kube-rbac-proxy-metrics/0.log" Apr 23 09:05:14.987421 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987390 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w"] Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987725 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="gather" Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987737 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="gather" Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987751 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="copy" Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987758 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="copy" Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987820 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="copy" Apr 23 09:05:14.987872 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.987835 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e936f59-3191-4fe3-a45d-381713c176cc" containerName="gather" Apr 23 09:05:14.992535 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.992514 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:14.994054 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.994035 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"openshift-service-ca.crt\"" Apr 23 09:05:14.994156 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.994111 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"kube-root-ca.crt\"" Apr 23 09:05:14.994467 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:14.994451 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7xsfl\"/\"default-dockercfg-x5hz6\"" Apr 23 09:05:15.000943 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.000921 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w"] Apr 23 09:05:15.139223 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.139198 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-lib-modules\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.139329 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.139238 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8w8\" (UniqueName: \"kubernetes.io/projected/c95ea4ae-262b-4068-92ee-d6083f47a4ad-kube-api-access-rn8w8\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.139329 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.139268 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-sys\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.139329 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.139309 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-podres\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.139434 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.139362 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-proc\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240261 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240201 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8w8\" (UniqueName: \"kubernetes.io/projected/c95ea4ae-262b-4068-92ee-d6083f47a4ad-kube-api-access-rn8w8\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240261 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240244 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-sys\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " 
pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240261 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240264 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-podres\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240326 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-sys\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240361 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-proc\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240390 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-proc\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240415 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240364 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-podres\") pod \"perf-node-gather-daemonset-hmw9w\" 
(UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240544 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240437 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-lib-modules\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.240544 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.240524 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c95ea4ae-262b-4068-92ee-d6083f47a4ad-lib-modules\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.247170 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.247145 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8w8\" (UniqueName: \"kubernetes.io/projected/c95ea4ae-262b-4068-92ee-d6083f47a4ad-kube-api-access-rn8w8\") pod \"perf-node-gather-daemonset-hmw9w\" (UID: \"c95ea4ae-262b-4068-92ee-d6083f47a4ad\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.303291 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.303265 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" Apr 23 09:05:15.355641 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.355612 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ghbpk_26cb0581-bb5b-4912-8a69-48378d6dc35b/dns/0.log" Apr 23 09:05:15.371620 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.371597 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ghbpk_26cb0581-bb5b-4912-8a69-48378d6dc35b/kube-rbac-proxy/0.log" Apr 23 09:05:15.422379 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.422261 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w"] Apr 23 09:05:15.424913 ip-10-0-137-31 kubenswrapper[2559]: W0423 09:05:15.424887 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc95ea4ae_262b_4068_92ee_d6083f47a4ad.slice/crio-ad066fbb27c69dc0a451b3031aef429c6e82d3d47eafd4378ae110676d1afbe3 WatchSource:0}: Error finding container ad066fbb27c69dc0a451b3031aef429c6e82d3d47eafd4378ae110676d1afbe3: Status 404 returned error can't find the container with id ad066fbb27c69dc0a451b3031aef429c6e82d3d47eafd4378ae110676d1afbe3 Apr 23 09:05:15.460933 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.460913 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dnx2l_c5cbe02b-2850-4700-8339-43f2fe5f24d5/dns-node-resolver/0.log" Apr 23 09:05:15.592874 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.592820 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" event={"ID":"c95ea4ae-262b-4068-92ee-d6083f47a4ad","Type":"ContainerStarted","Data":"5e0e3dc98295747aec58f8e2f5d89e683ff58a6ee95e7678cce915e278795a6c"} Apr 23 09:05:15.592874 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.592851 2559 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" event={"ID":"c95ea4ae-262b-4068-92ee-d6083f47a4ad","Type":"ContainerStarted","Data":"ad066fbb27c69dc0a451b3031aef429c6e82d3d47eafd4378ae110676d1afbe3"}
Apr 23 09:05:15.593023 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.592962 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w"
Apr 23 09:05:15.608047 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.607946 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w" podStartSLOduration=1.60793078 podStartE2EDuration="1.60793078s" podCreationTimestamp="2026-04-23 09:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:05:15.606274671 +0000 UTC m=+881.418880058" watchObservedRunningTime="2026-04-23 09:05:15.60793078 +0000 UTC m=+881.420536172"
Apr 23 09:05:15.904931 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:15.904854 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j7htq_d23173d3-6e3a-4b55-b1dc-7075f2278e15/node-ca/0.log"
Apr 23 09:05:16.858110 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:16.858084 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l59kj_492da4be-9e87-49d4-91cf-b96c4bece553/serve-healthcheck-canary/0.log"
Apr 23 09:05:17.272713 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:17.272686 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dpsp_761dbda3-1985-42a4-a075-6cb13ccc1d11/kube-rbac-proxy/0.log"
Apr 23 09:05:17.290628 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:17.290593 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dpsp_761dbda3-1985-42a4-a075-6cb13ccc1d11/exporter/0.log"
Apr 23 09:05:17.309439 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:17.309417 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2dpsp_761dbda3-1985-42a4-a075-6cb13ccc1d11/extractor/0.log"
Apr 23 09:05:21.605887 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:21.605855 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-hmw9w"
Apr 23 09:05:22.277305 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:22.277275 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2hzz6_158e7a7b-4a03-4678-8b2c-0dc7d0b7913c/kube-storage-version-migrator-operator/1.log"
Apr 23 09:05:22.279077 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:22.279052 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2hzz6_158e7a7b-4a03-4678-8b2c-0dc7d0b7913c/kube-storage-version-migrator-operator/0.log"
Apr 23 09:05:23.550132 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.550106 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/kube-multus-additional-cni-plugins/0.log"
Apr 23 09:05:23.568344 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.568321 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/egress-router-binary-copy/0.log"
Apr 23 09:05:23.586213 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.586191 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/cni-plugins/0.log"
Apr 23 09:05:23.605109 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.605085 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/bond-cni-plugin/0.log"
Apr 23 09:05:23.621593 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.621572 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/routeoverride-cni/0.log"
Apr 23 09:05:23.636706 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.636690 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/whereabouts-cni-bincopy/0.log"
Apr 23 09:05:23.652826 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.652805 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nxtdf_757f8beb-3271-44ad-88be-22369a09a56a/whereabouts-cni/0.log"
Apr 23 09:05:23.678657 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.678640 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pbxp4_be6f3563-0a3f-4959-9cda-d87e7a467749/kube-multus/0.log"
Apr 23 09:05:23.789441 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.789417 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lx4sg_61450b58-933b-4b5d-bf40-9e4408670e3e/network-metrics-daemon/0.log"
Apr 23 09:05:23.802833 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:23.802800 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lx4sg_61450b58-933b-4b5d-bf40-9e4408670e3e/kube-rbac-proxy/0.log"
Apr 23 09:05:25.155921 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.155849 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/ovn-controller/0.log"
Apr 23 09:05:25.176486 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.176457 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/ovn-acl-logging/0.log"
Apr 23 09:05:25.193879 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.193859 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/kube-rbac-proxy-node/0.log"
Apr 23 09:05:25.212737 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.212717 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 09:05:25.226038 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.226018 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/northd/0.log"
Apr 23 09:05:25.240378 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.240358 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/nbdb/0.log"
Apr 23 09:05:25.255598 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.255581 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/sbdb/0.log"
Apr 23 09:05:25.411120 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:25.411062 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sqslz_76d63805-1298-4d8e-8c56-96c2091a8697/ovnkube-controller/0.log"
Apr 23 09:05:26.329892 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:26.329867 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-mbsts_80e7d027-15dd-449f-b406-53648881a780/check-endpoints/0.log"
Apr 23 09:05:26.365873 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:26.365852 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7z6cq_4dc177c2-0f81-4db4-ac46-adbf96e2b0c5/network-check-target-container/0.log"
Apr 23 09:05:27.241609 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:27.241581 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fx7dn_de15a57c-d8b0-4a96-8a86-1a6ba933e6d9/iptables-alerter/0.log"
Apr 23 09:05:27.854433 ip-10-0-137-31 kubenswrapper[2559]: I0423 09:05:27.854402 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jjkvv_3a8842ac-bb6d-4882-b788-6c3e16e84191/tuned/0.log"