Apr 16 14:49:48.723025 ip-10-0-141-195 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:49:48.723037 ip-10-0-141-195 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:49:48.723045 ip-10-0-141-195 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:49:48.723258 ip-10-0-141-195 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:49:58.927380 ip-10-0-141-195 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:49:58.927395 ip-10-0-141-195 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 88bea7f63fe94384891ed3c430c2b76a --
Apr 16 14:52:18.941973 ip-10-0-141-195 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:19.449217 ip-10-0-141-195 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:19.449217 ip-10-0-141-195 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:19.449217 ip-10-0-141-195 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:19.449217 ip-10-0-141-195 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:19.449217 ip-10-0-141-195 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:19.450750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.450673    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:19.454654 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454627    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454656    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454661    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454665    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454668    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454670    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454675    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454677    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454680    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454683    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454685    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454688    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454691    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454694    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:19.454692 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454697    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454700    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454703    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454705    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454708    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454711    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454713    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454718    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454722    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454725    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454728    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454730    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454733    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454736    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454739    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454741    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454744    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454747    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454749    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454753    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:19.455022 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454757    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454759    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454762    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454764    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454766    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454769    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454771    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454774    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454776    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454779    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454781    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454783    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454786    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454788    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454792    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454794    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454797    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454799    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454802    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454805    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:19.455495 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454807    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454813    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454815    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454818    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454820    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454823    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454826    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454828    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454831    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454833    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454836    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454839    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454841    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454843    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454846    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454848    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454851    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454853    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454856    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454858    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:19.455998 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454861    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454863    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454866    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454868    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454872    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454876    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454883    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454886    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454888    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454891    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454893    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.454896    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455278    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455284    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455287    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455290    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455293    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455297    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455300    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:19.456485 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455304    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455308    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455311    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455314    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455318    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455320    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455323    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455326    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455329    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455332    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455335    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455337    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455340    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455343    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455345    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455348    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455351    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455353    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455356    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455359    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:19.456950 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455363    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455365    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455368    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455370    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455373    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455376    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455378    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455382    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455385    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455387    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455390    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455392    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455395    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455397    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455400    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455402    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455405    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455407    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455409    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455412    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:19.457460 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455415    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455418    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455420    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455423    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455426    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455429    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455432    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455435    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455437    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455440    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455443    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455446    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455448    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455451    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455453    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455456    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455459    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455461    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455464    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455466    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:19.457957 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455469    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455472    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455474    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455477    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455479    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455482    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455484    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455487    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455489    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455492    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455494    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455498    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455500    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455503    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455506    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455509    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455511    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455514    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.455517    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456222    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456232    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:19.458438 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456238    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456242    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456247    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456250    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456255    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456259    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456262    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456266    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456270    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456273    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456276    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456280    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456284    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456287    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456290    2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456293    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456296    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456301    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456304    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456307    2576 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456309    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456313    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456317    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456320    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:19.458965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456323    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456326    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456329    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456332    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456335    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456338    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456341    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456345    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456348    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456352    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456354    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456358    2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456360    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456366    2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456369    2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456372    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456375    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456378    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456382    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456385    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456388    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456391    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456394    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456397    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456400    2576 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 16 14:52:19.459540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456403 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456406 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456409 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456411 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456415 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456419 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456422 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456425 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456428 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456431 2576 flags.go:64] FLAG: --help="false" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456434 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456437 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456440 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456443 2576 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456446 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456450 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456453 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456456 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456461 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456464 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456466 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456470 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456473 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456475 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:19.460165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456478 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456482 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456484 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:19.460750 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:19.456487 2576 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456490 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456494 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456497 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456502 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456505 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456508 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456511 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456514 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456517 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456520 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456523 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456527 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456530 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456534 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 
14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456537 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456540 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456543 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456546 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456549 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456553 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456556 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:19.460750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456564 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456568 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456571 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456574 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456577 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456583 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:52:19.456586 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456589 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456592 2576 flags.go:64] FLAG: --port="10250" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456595 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456598 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cf067fa6ed03abaa" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456602 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456605 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456608 2576 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456611 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456614 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456617 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456620 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456622 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456625 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456629 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456632 2576 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456648 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456651 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456654 2576 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:19.461346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456656 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456659 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456663 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456666 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456668 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456671 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456675 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456678 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456682 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456685 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456688 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456691 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456694 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456697 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456700 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456705 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456708 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456710 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456715 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456717 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456720 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456724 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456727 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456730 2576 flags.go:64] FLAG: --v="2" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456734 2576 flags.go:64] FLAG: --version="false" Apr 16 14:52:19.461957 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456738 2576 flags.go:64] FLAG: --vmodule="" 
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456742 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.456745 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456835 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456839 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456842 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456844 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456847 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456849 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456852 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456855 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456857 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456860 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456863 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456867 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456872 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456875 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456878 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456881 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:19.462584 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456884 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456887 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456889 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456892 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456894 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456897 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456900 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456903 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456905 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456908 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456911 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456914 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456916 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456919 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456921 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456924 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456927 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456931 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456934 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456937 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:19.463067 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456939 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456942 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456944 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456947 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456949 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456952 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456954 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456957 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456962 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456964 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456967 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456969 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456972 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456974 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456977 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456979 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456982 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456984 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456987 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456989 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:19.463619 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456992 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456995 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.456997 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457000 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457003 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457005 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457008 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457010 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457012 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457015 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457017 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457020 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457022 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457024 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457027 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457029 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457032 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457034 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457037 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457039 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:19.464498 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457043 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457045 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457048 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457051 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457053 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457056 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457058 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457061 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457063 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.457065 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:19.465028 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.457698 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:19.465305 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.465286 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:19.465335 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.465307 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:19.465362 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465355 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465362 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465367 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465371 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465374 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465377 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465380 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465383 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465386 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465388 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465391 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:19.465391 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465394 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465397 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465400 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465402 2576 feature_gate.go:328] unrecognized feature 
gate: SignatureStores Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465405 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465408 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465410 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465413 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465416 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465418 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465421 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465424 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465426 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465429 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465432 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465434 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465437 
2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465439 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465442 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465444 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:19.465683 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465447 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465450 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465452 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465455 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465457 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465460 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465462 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465465 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465467 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace 
Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465469 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465472 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465475 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465487 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465491 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465495 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465497 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465500 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465503 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465507 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465510 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:19.466162 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465513 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 
14:52:19.465515 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465518 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465521 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465523 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465526 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465528 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465531 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465533 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465536 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465538 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465541 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465543 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465545 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:19.466759 ip-10-0-141-195 
kubenswrapper[2576]: W0416 14:52:19.465548 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465551 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465553 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465556 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465559 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465561 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:19.466759 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465564 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465566 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465569 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465572 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465574 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465582 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465584 2576 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465587 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465589 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465592 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465595 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465597 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465599 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465602 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465605 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.465610 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:19.467240 ip-10-0-141-195 kubenswrapper[2576]: W0416 
14:52:19.465737 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465743 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465746 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465749 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465752 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465755 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465758 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465761 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465763 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465766 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465768 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465771 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465774 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:19.467649 
ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465776 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465779 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465782 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465784 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465787 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465790 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465792 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:19.467649 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465795 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465797 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465804 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465807 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465810 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465812 2576 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465815 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465817 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465820 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465822 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465825 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465827 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465830 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465832 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465835 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465837 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465840 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465842 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:19.468132 ip-10-0-141-195 
kubenswrapper[2576]: W0416 14:52:19.465845 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:19.468132 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465847 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465849 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465853 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465855 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465857 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465860 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465862 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465865 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465867 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465870 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465873 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465875 2576 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465877 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465880 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465882 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465885 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465888 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465891 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465893 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:19.468621 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465896 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465899 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465901 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465904 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465906 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 
14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465909 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465911 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465914 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465917 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465919 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465922 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465924 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465926 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465929 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465931 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465934 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465936 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465939 2576 feature_gate.go:328] unrecognized 
feature gate: AzureMultiDisk Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465941 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465944 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:19.469131 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465948 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465952 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465955 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465958 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465961 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465965 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465968 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:19.465971 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.465975 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.466756 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 14:52:19.469616 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.469463 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 14:52:19.470505 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.470493 2576 server.go:1019] "Starting client certificate rotation" Apr 16 14:52:19.470604 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.470588 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:19.470633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.470622 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:19.498791 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.498765 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:19.501327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.501307 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:19.516066 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.516045 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:19.521370 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.521356 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:19.523512 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.523499 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:19.526538 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.526507 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 86f3104d-02dc-49d2-95a9-9477df2cb6b9:/dev/nvme0n1p3 b07f8a88-780d-4a2a-ab90-8aa108d47b24:/dev/nvme0n1p4]
Apr 16 14:52:19.526613 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.526536 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:19.531606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.531578 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:19.531723 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.531506 2576 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:19.530289409 +0000 UTC m=+0.448164216 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099295 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b9625fe4627ccd546d3f08895e822 SystemUUID:ec2b9625-fe46-27cc-d546-d3f08895e822 BootID:88bea7f6-3fe9-4384-891e-d3c430c2b76a Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:37:41:2e:e9:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:37:41:2e:e9:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:66:b2:44:66:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:19.531723 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.531721 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:19.531822 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.531808 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:19.533919 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.533891 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:19.534052 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.533922 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-195.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:19.534097 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.534062 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:19.534097 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.534071 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:19.534097 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.534084
2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:19.534796 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.534786 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:19.536239 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.536228 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:19.536497 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.536488 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:19.539598 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.539589 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:19.539630 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.539603 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:52:19.539630 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.539615 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:19.539630 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.539624 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:19.539734 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.539633 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:19.541242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.541228 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:19.541294 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.541250 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:19.542936 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.542918 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dl4db" Apr 16 14:52:19.544344 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:19.544327 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:19.546659 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.546632 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:19.548171 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548158 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548175 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548182 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548187 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548194 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548204 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548233 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:19.548242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548240 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548248 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548254 2576 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548269 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548278 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548308 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:19.548431 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.548318 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:19.550827 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.550806 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dl4db" Apr 16 14:52:19.552513 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552499 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:19.552557 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552539 2576 server.go:1295] "Started kubelet" Apr 16 14:52:19.552717 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552667 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:19.552717 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552681 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-195.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:19.552844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552664 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:19.552844 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.552763 2576 reflector.go:200] "Failed to watch" err="failed 
to list *v1.Node: nodes \"ip-10-0-141-195.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:19.552844 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.552763 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:19.552844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.552780 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:19.553291 ip-10-0-141-195 systemd[1]: Started Kubernetes Kubelet. Apr 16 14:52:19.553920 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.553780 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:19.554061 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.554045 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:19.559458 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.559438 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:19.560540 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.560350 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:19.561563 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.561509 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 14:52:19.561679 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.561663 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:19.561759 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:52:19.561685 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:19.561759 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.561671 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:19.561839 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.561760 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:19.561839 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.561769 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:19.563397 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563376 2576 factory.go:55] Registering systemd factory Apr 16 14:52:19.563397 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563397 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:19.563621 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.563601 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:19.563738 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563656 2576 factory.go:153] Registering CRI-O factory Apr 16 14:52:19.563738 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563672 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:19.563738 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563737 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:19.563888 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563762 2576 factory.go:103] Registering Raw factory Apr 16 14:52:19.563888 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.563776 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:19.564263 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.564249 2576 manager.go:319] Starting recovery of all containers Apr 16 14:52:19.567703 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.567680 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:19.572104 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.571983 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:19.575670 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.575500 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-195.ec2.internal\" not found" node="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.579009 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.578992 2576 manager.go:324] Recovery completed Apr 16 14:52:19.582882 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.582870 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.585285 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585264 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.585348 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585292 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:19.585348 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585305 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.585781 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585768 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:19.585781 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585779 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 
14:52:19.585862 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.585798 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:19.588215 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.588202 2576 policy_none.go:49] "None policy: Start" Apr 16 14:52:19.588215 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.588217 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:19.588299 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.588226 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.627864 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.627901 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.627911 2576 server.go:85] "Starting device plugin registration server" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.628139 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.628159 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.628248 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.628327 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.628338 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.628876 2576 eviction_manager.go:267] 
"eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:19.638961 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.628918 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 14:52:19.704588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.704530 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:19.704588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.704558 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:19.704588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.704573 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 14:52:19.704588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.704579 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:19.704817 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.704608 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:19.708909 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.708892 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:19.728845 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.728829 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.729564 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.729550 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.729629 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.729581 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-141-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:19.729629 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.729592 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.729629 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.729612 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.738917 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.738901 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.738959 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.738927 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-195.ec2.internal\": node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 14:52:19.762649 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.762624 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 14:52:19.804819 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.804775 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal"] Apr 16 14:52:19.804870 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.804857 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.806198 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.806185 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.806259 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.806211 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 14:52:19.806259 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.806224 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.807541 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.807529 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.807714 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.807702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.807754 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.807728 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.808250 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.808234 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.808318 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.808264 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:19.808318 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.808236 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.808318 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.808277 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.808318 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.808291 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:19.808318 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:52:19.808304 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.809864 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.809843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.809941 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.809872 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:19.810559 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.810547 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:19.810612 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.810569 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:19.810612 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.810579 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:19.827146 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.827126 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-195.ec2.internal\" not found" node="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.831508 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.831494 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-195.ec2.internal\" not found" node="ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.863333 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.863314 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 
14:52:19.864461 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.864448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d244d209c4485d5d4fbaebc8851f6290-config\") pod \"kube-apiserver-proxy-ip-10-0-141-195.ec2.internal\" (UID: \"d244d209c4485d5d4fbaebc8851f6290\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.864511 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.864469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.864511 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.864487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.963565 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:19.963499 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found" Apr 16 14:52:19.964665 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.964745 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.964745 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d244d209c4485d5d4fbaebc8851f6290-config\") pod \"kube-apiserver-proxy-ip-10-0-141-195.ec2.internal\" (UID: \"d244d209c4485d5d4fbaebc8851f6290\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.964844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.964844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d244d209c4485d5d4fbaebc8851f6290-config\") pod \"kube-apiserver-proxy-ip-10-0-141-195.ec2.internal\" (UID: \"d244d209c4485d5d4fbaebc8851f6290\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" Apr 16 14:52:19.964844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:19.964773 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6e540316e449bbc0853c708e7d4b7aa0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal\" (UID: \"6e540316e449bbc0853c708e7d4b7aa0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.064322 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.064278 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found"
Apr 16 14:52:20.130576 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.130555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.133725 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.133709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.164605 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.164569 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found"
Apr 16 14:52:20.265069 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.264987 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found"
Apr 16 14:52:20.365497 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.365471 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-195.ec2.internal\" not found"
Apr 16 14:52:20.453144 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.452799 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:20.461012 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.460996 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.469849 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.469829 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 14:52:20.469983 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.469964 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:20.470024 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.469962 2576 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a795ea6d85fa545579f0401bc09884af-cb7d424b29f6becf.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.141.195:41474->3.216.86.221:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.470024 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.469994 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal"
Apr 16 14:52:20.470024 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.469967 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:20.470114 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.469967 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:20.486180 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.486160 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:20.540554 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.540533 2576 apiserver.go:52] "Watching apiserver"
Apr 16 14:52:20.546970 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.546950 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 14:52:20.547366 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.547344 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jsx45","kube-system/konnectivity-agent-2tqg5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln","openshift-cluster-node-tuning-operator/tuned-gd596","openshift-dns/node-resolver-8js26","openshift-image-registry/node-ca-7xqtf","openshift-multus/network-metrics-daemon-5zzl5","openshift-network-diagnostics/network-check-target-dpr62","openshift-network-operator/iptables-alerter-9tmph","kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal","openshift-multus/multus-76rhf","openshift-multus/multus-additional-cni-plugins-gn47f"]
Apr 16 14:52:20.551394 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.550023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:20.551394 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.550110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:20.551394 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.550411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.552552 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.552522 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:19 +0000 UTC" deadline="2027-09-29 22:57:30.028322133 +0000 UTC"
Apr 16 14:52:20.552672 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.552551 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12752h5m9.475774589s"
Apr 16 14:52:20.552956 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.552916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 14:52:20.553057 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.552985 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z8nh9\""
Apr 16 14:52:20.553121 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.552930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 14:52:20.553173 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.553122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.554544 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.554527 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.555043 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.555027 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.555132 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.555098 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.555132 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.555098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-58s96\""
Apr 16 14:52:20.555868 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.555855 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.555951 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.555934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.556364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.556348 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:20.556808 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.556795 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.557031 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.557019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.557314 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.557299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.557374 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.557358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8cxk6\""
Apr 16 14:52:20.557862 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.557843 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.558048 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558030 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.558048 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558048 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.558180 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.558180 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-847j8\""
Apr 16 14:52:20.558180 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:52:20.558320 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558051 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cj8gs\""
Apr 16 14:52:20.558531 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.558509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:20.558613 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.558567 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:20.559303 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.559284 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:52:20.560037 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560019 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:20.560111 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.560366 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560348 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.560562 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.560672 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:52:20.560672 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.560542 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fpdhf\""
Apr 16 14:52:20.561029 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.561009 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:52:20.562884 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.561539 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:52:20.562884 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.562433 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.563706 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.563125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lvsxn\""
Apr 16 14:52:20.563706 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.563139 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.563706 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.563416 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:52:20.564517 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.564500 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.566172 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.566753 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:52:20.566836 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkltk\" (UniqueName: \"kubernetes.io/projected/44f20ff6-a1ac-47f8-845d-906159fbce7f-kube-api-access-wkltk\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.566836 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-socket-dir-parent\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.566941 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-netns\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.566941 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-multus\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.566941 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-kubelet\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-etc-kubernetes\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.566980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-ovn\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567002 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-device-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-systemd\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-run\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-lib-modules\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.567126 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-bin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/653d6001-f9f0-440a-9aab-e87455bc4e3f-host\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/653d6001-f9f0-440a-9aab-e87455bc4e3f-serviceca\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlf5p\" (UniqueName: \"kubernetes.io/projected/ffd2dff3-4034-47e4-b3bf-bd072dba227e-kube-api-access-dlf5p\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-socket-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-os-release\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d793735-b1fb-4ca3-bc99-1447700e773f-konnectivity-ca\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-env-overrides\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovn-node-metrics-cert\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-sys-fs\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-kubernetes\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-system-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-daemon-config\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx46k\" (UniqueName: \"kubernetes.io/projected/fb239c54-f254-4320-9008-4b4f5895660d-kube-api-access-tx46k\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15bab44e-b8bf-4170-b623-a5f844d8bfb0-hosts-file\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.567502 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d793735-b1fb-4ca3-bc99-1447700e773f-agent-certs\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-tuned\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-hostroot\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-tmp\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-kubelet\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-netd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cnibin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqb4\" (UniqueName: \"kubernetes.io/projected/c68702b9-21dc-43b1-ba9f-d5a236a6b183-kube-api-access-fdqb4\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-var-lib-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tc7xh\""
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567805 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.567767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-etc-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m95lh\" (UniqueName: \"kubernetes.io/projected/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-kube-api-access-m95lh\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvc9\" (UniqueName: \"kubernetes.io/projected/653d6001-f9f0-440a-9aab-e87455bc4e3f-kube-api-access-cqvc9\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15bab44e-b8bf-4170-b623-a5f844d8bfb0-tmp-dir\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.568287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-script-lib\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzv5\" (UniqueName: \"kubernetes.io/projected/15bab44e-b8bf-4170-b623-a5f844d8bfb0-kube-api-access-jhzv5\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-sys\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-multus-certs\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-bin\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-config\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-systemd-units\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-registration-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-modprobe-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-var-lib-kubelet\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-host\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vgtz9\""
Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-mhsmj\" (UniqueName: \"kubernetes.io/projected/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-kube-api-access-mhsmj\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.569448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-node-log\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-k8s-cni-cncf-io\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-slash\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568888 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-netns\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-systemd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffd2dff3-4034-47e4-b3bf-bd072dba227e-iptables-alerter-script\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffd2dff3-4034-47e4-b3bf-bd072dba227e-host-slash\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.568986 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysconfig\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.569010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-conf\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.569036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-conf-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.569053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-log-socket\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.570027 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.569075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.570027 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.569098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cni-binary-copy\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.572356 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.572340 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:20.614273 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.614253 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fcbzf" Apr 16 14:52:20.621677 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.621657 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fcbzf" Apr 16 14:52:20.654767 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.654729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd244d209c4485d5d4fbaebc8851f6290.slice/crio-db94d00f44fab247ff92cb87826d044338694d52a5f15210aefc00ef8b2ec01a WatchSource:0}: Error finding container db94d00f44fab247ff92cb87826d044338694d52a5f15210aefc00ef8b2ec01a: Status 404 returned error can't find the container with id db94d00f44fab247ff92cb87826d044338694d52a5f15210aefc00ef8b2ec01a Apr 16 14:52:20.655020 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.655002 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e540316e449bbc0853c708e7d4b7aa0.slice/crio-94ff17b631397b97c716190c23cd464e9f95364361124f4258aecfcba153ce40 WatchSource:0}: Error finding container 
94ff17b631397b97c716190c23cd464e9f95364361124f4258aecfcba153ce40: Status 404 returned error can't find the container with id 94ff17b631397b97c716190c23cd464e9f95364361124f4258aecfcba153ce40 Apr 16 14:52:20.662301 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.662280 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:20.663148 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.663132 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:20.669281 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.669364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cnibin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.669364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqb4\" (UniqueName: \"kubernetes.io/projected/c68702b9-21dc-43b1-ba9f-d5a236a6b183-kube-api-access-fdqb4\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.669364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-var-lib-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-etc-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m95lh\" (UniqueName: \"kubernetes.io/projected/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-kube-api-access-m95lh\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cnibin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-binary-copy\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvc9\" (UniqueName: \"kubernetes.io/projected/653d6001-f9f0-440a-9aab-e87455bc4e3f-kube-api-access-cqvc9\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-var-lib-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-etc-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669508 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15bab44e-b8bf-4170-b623-a5f844d8bfb0-tmp-dir\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-script-lib\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.669565 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.669549 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.669612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:21.169581876 +0000 UTC m=+2.087456691 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzv5\" (UniqueName: \"kubernetes.io/projected/15bab44e-b8bf-4170-b623-a5f844d8bfb0-kube-api-access-jhzv5\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-sys\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:20.670143 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:20.669745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-multus-certs\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-multus-certs\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-sys\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-cnibin\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-bin\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670143 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:20.669868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15bab44e-b8bf-4170-b623-a5f844d8bfb0-tmp-dir\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-config\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-systemd-units\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.669948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-bin\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670143 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:20.670022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-systemd-units\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-registration-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-openvswitch\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-modprobe-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 
14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-var-lib-kubelet\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-registration-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-host\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-script-lib\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gn47f\" (UID: 
\"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-var-lib-kubelet\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsmj\" (UniqueName: \"kubernetes.io/projected/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-kube-api-access-mhsmj\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-modprobe-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-host\") pod \"tuned-gd596\" (UID: 
\"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-node-log\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-node-log\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.670833 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-k8s-cni-cncf-io\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-k8s-cni-cncf-io\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-slash\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-netns\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-netns\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-slash\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-systemd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffd2dff3-4034-47e4-b3bf-bd072dba227e-iptables-alerter-script\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffd2dff3-4034-47e4-b3bf-bd072dba227e-host-slash\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-systemd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysconfig\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffd2dff3-4034-47e4-b3bf-bd072dba227e-host-slash\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-conf\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysconfig\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-conf-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-log-socket\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.671588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cni-binary-copy\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-conf\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkltk\" (UniqueName: \"kubernetes.io/projected/44f20ff6-a1ac-47f8-845d-906159fbce7f-kube-api-access-wkltk\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-conf-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-log-socket\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-socket-dir-parent\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-sysctl-d\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-socket-dir-parent\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovnkube-config\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-netns\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-multus\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-kubelet\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-multus\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-etc-kubernetes\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-kubelet\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-etc-kubernetes\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffd2dff3-4034-47e4-b3bf-bd072dba227e-iptables-alerter-script\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-ovn\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.672399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-run-ovn\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-device-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.670901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-run-netns\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-systemd\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-run\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-lib-modules\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-systemd\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-device-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-bin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-run\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-host-var-lib-cni-bin\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/653d6001-f9f0-440a-9aab-e87455bc4e3f-host\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-lib-modules\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/653d6001-f9f0-440a-9aab-e87455bc4e3f-serviceca\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/653d6001-f9f0-440a-9aab-e87455bc4e3f-host\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlf5p\" (UniqueName: \"kubernetes.io/projected/ffd2dff3-4034-47e4-b3bf-bd072dba227e-kube-api-access-dlf5p\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph"
Apr 16 14:52:20.673487 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-socket-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-os-release\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-os-release\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d793735-b1fb-4ca3-bc99-1447700e773f-konnectivity-ca\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-os-release\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-socket-dir\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-env-overrides\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovn-node-metrics-cert\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-sys-fs\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-kubernetes\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-system-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-daemon-config\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44f20ff6-a1ac-47f8-845d-906159fbce7f-sys-fs\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx46k\" (UniqueName: \"kubernetes.io/projected/fb239c54-f254-4320-9008-4b4f5895660d-kube-api-access-tx46k\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-system-cni-dir\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15bab44e-b8bf-4170-b623-a5f844d8bfb0-hosts-file\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.674416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-system-cni-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-kubernetes\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqg9\" (UniqueName: \"kubernetes.io/projected/fac8cd8d-4a46-40df-b803-d33c16259cc1-kube-api-access-dxqg9\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/653d6001-f9f0-440a-9aab-e87455bc4e3f-serviceca\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15bab44e-b8bf-4170-b623-a5f844d8bfb0-hosts-file\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671738 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-env-overrides\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-cni-binary-copy\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d793735-b1fb-4ca3-bc99-1447700e773f-agent-certs\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d793735-b1fb-4ca3-bc99-1447700e773f-konnectivity-ca\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-tuned\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-hostroot\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-tmp\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-kubelet\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-netd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.671992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68702b9-21dc-43b1-ba9f-d5a236a6b183-hostroot\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.674980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.672026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-kubelet\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.672067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-host-cni-netd\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.672156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68702b9-21dc-43b1-ba9f-d5a236a6b183-multus-daemon-config\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.674814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-tmp\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.674821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-ovn-node-metrics-cert\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.674831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-etc-tuned\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " pod="openshift-cluster-node-tuning-operator/tuned-gd596"
Apr 16 14:52:20.675551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.675041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d793735-b1fb-4ca3-bc99-1447700e773f-agent-certs\") pod \"konnectivity-agent-2tqg5\" (UID: \"0d793735-b1fb-4ca3-bc99-1447700e773f\") " pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:20.675977 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.675961 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 
14:52:20.676015 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.675982 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:20.676015 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.675995 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:20.676083 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:20.676054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:21.176038675 +0000 UTC m=+2.093913488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:20.678275 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.678255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvc9\" (UniqueName: \"kubernetes.io/projected/653d6001-f9f0-440a-9aab-e87455bc4e3f-kube-api-access-cqvc9\") pod \"node-ca-7xqtf\" (UID: \"653d6001-f9f0-440a-9aab-e87455bc4e3f\") " pod="openshift-image-registry/node-ca-7xqtf" Apr 16 14:52:20.678723 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.678615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzv5\" (UniqueName: \"kubernetes.io/projected/15bab44e-b8bf-4170-b623-a5f844d8bfb0-kube-api-access-jhzv5\") pod \"node-resolver-8js26\" (UID: \"15bab44e-b8bf-4170-b623-a5f844d8bfb0\") " pod="openshift-dns/node-resolver-8js26" Apr 16 14:52:20.678723 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.678694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqb4\" (UniqueName: \"kubernetes.io/projected/c68702b9-21dc-43b1-ba9f-d5a236a6b183-kube-api-access-fdqb4\") pod \"multus-76rhf\" (UID: \"c68702b9-21dc-43b1-ba9f-d5a236a6b183\") " pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.679161 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.679143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsmj\" (UniqueName: \"kubernetes.io/projected/6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78-kube-api-access-mhsmj\") pod \"tuned-gd596\" (UID: \"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78\") " 
pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.679443 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.679424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m95lh\" (UniqueName: \"kubernetes.io/projected/7888efd0-340e-44a4-9e27-7fbbad8b7bfd-kube-api-access-m95lh\") pod \"ovnkube-node-jsx45\" (UID: \"7888efd0-340e-44a4-9e27-7fbbad8b7bfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.682152 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.682129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlf5p\" (UniqueName: \"kubernetes.io/projected/ffd2dff3-4034-47e4-b3bf-bd072dba227e-kube-api-access-dlf5p\") pod \"iptables-alerter-9tmph\" (UID: \"ffd2dff3-4034-47e4-b3bf-bd072dba227e\") " pod="openshift-network-operator/iptables-alerter-9tmph" Apr 16 14:52:20.682514 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.682499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkltk\" (UniqueName: \"kubernetes.io/projected/44f20ff6-a1ac-47f8-845d-906159fbce7f-kube-api-access-wkltk\") pod \"aws-ebs-csi-driver-node-tsnln\" (UID: \"44f20ff6-a1ac-47f8-845d-906159fbce7f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.682667 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.682631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx46k\" (UniqueName: \"kubernetes.io/projected/fb239c54-f254-4320-9008-4b4f5895660d-kube-api-access-tx46k\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:20.707932 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.707896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" 
event={"ID":"d244d209c4485d5d4fbaebc8851f6290","Type":"ContainerStarted","Data":"db94d00f44fab247ff92cb87826d044338694d52a5f15210aefc00ef8b2ec01a"} Apr 16 14:52:20.708773 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.708752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" event={"ID":"6e540316e449bbc0853c708e7d4b7aa0","Type":"ContainerStarted","Data":"94ff17b631397b97c716190c23cd464e9f95364361124f4258aecfcba153ce40"} Apr 16 14:52:20.772981 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.772963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-os-release\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773050 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.772997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773050 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-system-cni-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773050 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dxqg9\" (UniqueName: \"kubernetes.io/projected/fac8cd8d-4a46-40df-b803-d33c16259cc1-kube-api-access-dxqg9\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773190 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-system-cni-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773190 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-os-release\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773190 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-binary-copy\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773406 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-cnibin\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773406 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:20.773245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773406 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773406 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-cnibin\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773595 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773701 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.773853 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.773839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fac8cd8d-4a46-40df-b803-d33c16259cc1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.774068 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.774052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fac8cd8d-4a46-40df-b803-d33c16259cc1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.781009 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.780988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqg9\" (UniqueName: \"kubernetes.io/projected/fac8cd8d-4a46-40df-b803-d33c16259cc1-kube-api-access-dxqg9\") pod \"multus-additional-cni-plugins-gn47f\" (UID: \"fac8cd8d-4a46-40df-b803-d33c16259cc1\") " pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:20.882125 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.882077 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2tqg5" Apr 16 14:52:20.887708 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.887680 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d793735_b1fb_4ca3_bc99_1447700e773f.slice/crio-847d5fa6ecafafdf0ed5b3fd09f8bba3b89dd8ec63d00e149629b4bfd7881b82 WatchSource:0}: Error finding container 847d5fa6ecafafdf0ed5b3fd09f8bba3b89dd8ec63d00e149629b4bfd7881b82: Status 404 returned error can't find the container with id 847d5fa6ecafafdf0ed5b3fd09f8bba3b89dd8ec63d00e149629b4bfd7881b82 Apr 16 14:52:20.894297 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.894282 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gd596" Apr 16 14:52:20.899794 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.899772 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cdb3edd_1f53_40cc_a5d3_2aeb3c013c78.slice/crio-aa00c725bbaa6e9940221c09ef66f67e05cfa122b6f55519623910f75cb4884a WatchSource:0}: Error finding container aa00c725bbaa6e9940221c09ef66f67e05cfa122b6f55519623910f75cb4884a: Status 404 returned error can't find the container with id aa00c725bbaa6e9940221c09ef66f67e05cfa122b6f55519623910f75cb4884a Apr 16 14:52:20.900247 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.900230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" Apr 16 14:52:20.903702 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.903685 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8js26" Apr 16 14:52:20.907385 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.907366 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f20ff6_a1ac_47f8_845d_906159fbce7f.slice/crio-bc8a332ffcc33f1ef23b080e2a5804433d58f92f6e4954f0f372f1b4422b6dd9 WatchSource:0}: Error finding container bc8a332ffcc33f1ef23b080e2a5804433d58f92f6e4954f0f372f1b4422b6dd9: Status 404 returned error can't find the container with id bc8a332ffcc33f1ef23b080e2a5804433d58f92f6e4954f0f372f1b4422b6dd9 Apr 16 14:52:20.910928 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.910911 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15bab44e_b8bf_4170_b623_a5f844d8bfb0.slice/crio-fa9546f37cfc4cffe8d3f359b3eb047d7babff3ffa377836b6f42c0e9ef46981 WatchSource:0}: Error finding container fa9546f37cfc4cffe8d3f359b3eb047d7babff3ffa377836b6f42c0e9ef46981: Status 404 returned error can't find the container with id fa9546f37cfc4cffe8d3f359b3eb047d7babff3ffa377836b6f42c0e9ef46981 Apr 16 14:52:20.921298 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.921277 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7xqtf" Apr 16 14:52:20.927401 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.927379 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod653d6001_f9f0_440a_9aab_e87455bc4e3f.slice/crio-210511e0493e51a52fdd7dee331c992313628a5d847db14315a021ab75ff0b03 WatchSource:0}: Error finding container 210511e0493e51a52fdd7dee331c992313628a5d847db14315a021ab75ff0b03: Status 404 returned error can't find the container with id 210511e0493e51a52fdd7dee331c992313628a5d847db14315a021ab75ff0b03 Apr 16 14:52:20.935135 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.935119 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" Apr 16 14:52:20.940666 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.940632 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7888efd0_340e_44a4_9e27_7fbbad8b7bfd.slice/crio-efbf3685c7826309381a1ac53e74fe9a4df20bb40884418249a19b7e77bfd1c2 WatchSource:0}: Error finding container efbf3685c7826309381a1ac53e74fe9a4df20bb40884418249a19b7e77bfd1c2: Status 404 returned error can't find the container with id efbf3685c7826309381a1ac53e74fe9a4df20bb40884418249a19b7e77bfd1c2 Apr 16 14:52:20.962999 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.962983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9tmph" Apr 16 14:52:20.968239 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.968190 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-76rhf" Apr 16 14:52:20.970599 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.970579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd2dff3_4034_47e4_b3bf_bd072dba227e.slice/crio-1e1b5adc067e2c5ec0aede4251feed956fbffcb395bfda92a5c61315267b277d WatchSource:0}: Error finding container 1e1b5adc067e2c5ec0aede4251feed956fbffcb395bfda92a5c61315267b277d: Status 404 returned error can't find the container with id 1e1b5adc067e2c5ec0aede4251feed956fbffcb395bfda92a5c61315267b277d Apr 16 14:52:20.975045 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:20.975021 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68702b9_21dc_43b1_ba9f_d5a236a6b183.slice/crio-c67ede35aa949021927ca698cb631d4310b05e1b3317e27e8783d30a1570e887 WatchSource:0}: Error finding container c67ede35aa949021927ca698cb631d4310b05e1b3317e27e8783d30a1570e887: Status 404 returned error can't find the container with id c67ede35aa949021927ca698cb631d4310b05e1b3317e27e8783d30a1570e887 Apr 16 14:52:20.993340 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:20.993326 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gn47f" Apr 16 14:52:21.005349 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.005328 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:21.176249 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.176170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:21.176249 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.176225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176328 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176363 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176378 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176385 2576 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:22.176367902 +0000 UTC m=+3.094242718 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176389 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:21.176452 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:21.176423 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:22.176414097 +0000 UTC m=+3.094288892 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:21.301523 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.301318 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:21.624058 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.623885 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:20 +0000 UTC" deadline="2027-11-23 02:51:21.828693696 +0000 UTC" Apr 16 14:52:21.624058 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.623923 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14051h59m0.204773932s" Apr 16 14:52:21.726206 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.726138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerStarted","Data":"60ffb160f310e5ef868c8c4bb3bf29ba6841f872f8b849042b3cbf08c5735f6f"} Apr 16 14:52:21.731238 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.731179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-76rhf" event={"ID":"c68702b9-21dc-43b1-ba9f-d5a236a6b183","Type":"ContainerStarted","Data":"c67ede35aa949021927ca698cb631d4310b05e1b3317e27e8783d30a1570e887"} Apr 16 14:52:21.744318 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.744279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" 
event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"efbf3685c7826309381a1ac53e74fe9a4df20bb40884418249a19b7e77bfd1c2"} Apr 16 14:52:21.764794 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.764751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8js26" event={"ID":"15bab44e-b8bf-4170-b623-a5f844d8bfb0","Type":"ContainerStarted","Data":"fa9546f37cfc4cffe8d3f359b3eb047d7babff3ffa377836b6f42c0e9ef46981"} Apr 16 14:52:21.777929 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.777876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gd596" event={"ID":"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78","Type":"ContainerStarted","Data":"aa00c725bbaa6e9940221c09ef66f67e05cfa122b6f55519623910f75cb4884a"} Apr 16 14:52:21.787234 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.787210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2tqg5" event={"ID":"0d793735-b1fb-4ca3-bc99-1447700e773f","Type":"ContainerStarted","Data":"847d5fa6ecafafdf0ed5b3fd09f8bba3b89dd8ec63d00e149629b4bfd7881b82"} Apr 16 14:52:21.803466 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.803439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9tmph" event={"ID":"ffd2dff3-4034-47e4-b3bf-bd072dba227e","Type":"ContainerStarted","Data":"1e1b5adc067e2c5ec0aede4251feed956fbffcb395bfda92a5c61315267b277d"} Apr 16 14:52:21.813371 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.813315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xqtf" event={"ID":"653d6001-f9f0-440a-9aab-e87455bc4e3f","Type":"ContainerStarted","Data":"210511e0493e51a52fdd7dee331c992313628a5d847db14315a021ab75ff0b03"} Apr 16 14:52:21.831122 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.831086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" event={"ID":"44f20ff6-a1ac-47f8-845d-906159fbce7f","Type":"ContainerStarted","Data":"bc8a332ffcc33f1ef23b080e2a5804433d58f92f6e4954f0f372f1b4422b6dd9"} Apr 16 14:52:21.936313 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:21.936055 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:22.182374 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.182340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:22.182564 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.182394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:22.182628 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.182567 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:22.182628 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.182583 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:22.182628 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.182596 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod 
openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:22.182917 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.182670 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:24.182633457 +0000 UTC m=+5.100508268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:22.183071 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.183051 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:22.183144 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.183106 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:24.183091687 +0000 UTC m=+5.100966485 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:22.625030 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.624945 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:20 +0000 UTC" deadline="2027-12-27 04:30:36.54816209 +0000 UTC" Apr 16 14:52:22.625030 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.624978 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14869h38m13.923187608s" Apr 16 14:52:22.706244 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.705606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:22.706244 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.705746 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:22.706244 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.706087 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:22.706244 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:22.706188 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:22.869116 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:22.868910 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:24.201802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:24.201856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.201994 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.202010 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.202024 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.202036 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.202069 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:28.202048524 +0000 UTC m=+9.119923332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:24.202115 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.202089 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:28.202080007 +0000 UTC m=+9.119954815 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:24.706622 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:24.706010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:24.706622 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.706128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:24.706622 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:24.706468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:24.706622 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:24.706590 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:26.706241 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:26.705542 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:26.706241 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:26.705682 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:26.706241 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:26.706097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:26.706241 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:26.706195 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:28.233159 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:28.233126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:28.233580 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:28.233187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:28.233580 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233284 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:28.233580 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233338 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:36.233324394 +0000 UTC m=+17.151199186 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:28.233768 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233679 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:28.233768 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233694 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:28.233768 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233705 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:28.233768 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.233741 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:36.233726775 +0000 UTC m=+17.151601570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:28.705500 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:28.705409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:28.705842 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.705550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:28.705842 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:28.705624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:28.705842 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:28.705710 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:30.705012 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:30.704973 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:30.705425 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:30.704973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:30.705425 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:30.705101 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:30.705425 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:30.705217 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:32.705043 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:32.705009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:32.705456 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:32.705009 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:32.705456 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:32.705106 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:32.705456 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:32.705226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:34.705448 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:34.705364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:34.705904 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:34.705364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:34.705904 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:34.705491 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:34.705904 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:34.705605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:36.296575 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:36.296540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:36.296583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296718 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296746 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296765 2576 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296777 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:52.29677294 +0000 UTC m=+33.214647739 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:36.297087 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.296829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:52.296816209 +0000 UTC m=+33.214691004 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:36.705705 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:36.705618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:36.705862 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.705745 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:36.705862 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:36.705808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:36.705958 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:36.705930 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:38.704930 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:38.704896 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:38.704930 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:38.704930 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:38.705413 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:38.705014 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:38.705413 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:38.705139 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:39.861766 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.861584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-76rhf" event={"ID":"c68702b9-21dc-43b1-ba9f-d5a236a6b183","Type":"ContainerStarted","Data":"7a6ca72aac36752e562816a9b3066b1bd1b2a2cee2932aaf18c5ab027a7b2a8b"} Apr 16 14:52:39.864244 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864224 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 14:52:39.864577 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864559 2576 generic.go:358] "Generic (PLEG): container finished" podID="7888efd0-340e-44a4-9e27-7fbbad8b7bfd" containerID="6991062f3beb10ed6d11e882f9b6479f241e0d4ae6a96eb4b4bbc06d5e960de4" exitCode=1 Apr 16 14:52:39.864679 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"07aa28f9df9608ed57ae57880a413796d22e054a8a2a3fb1a7ad0528e7bc6f90"} Apr 16 14:52:39.864679 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"997343d030466cf4c210a0acb49e2cbb508eccefc5993533115d59a008817dad"} Apr 16 14:52:39.864790 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerDied","Data":"6991062f3beb10ed6d11e882f9b6479f241e0d4ae6a96eb4b4bbc06d5e960de4"} Apr 16 14:52:39.864790 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.864699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"8fea14f8a682bfb7b916105982ae71a9dfe17acac083155e09ffde6595a576d2"} Apr 16 14:52:39.869166 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.869137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gd596" event={"ID":"6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78","Type":"ContainerStarted","Data":"9dc3843a79d5acdf5cbf83a637d9bbf7dc70448ef7a39180672810b737d8c5ca"} Apr 16 14:52:39.871476 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.871453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" event={"ID":"d244d209c4485d5d4fbaebc8851f6290","Type":"ContainerStarted","Data":"ee178dd8dcf53a308ea2f07f1ddc228de0e39923a688ebbeb41d41f093b6720d"} Apr 16 14:52:39.892715 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.892628 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-195.ec2.internal" podStartSLOduration=19.892612726 podStartE2EDuration="19.892612726s" podCreationTimestamp="2026-04-16 14:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:39.892455457 +0000 UTC m=+20.810330274" watchObservedRunningTime="2026-04-16 14:52:39.892612726 +0000 UTC m=+20.810487541" Apr 16 14:52:39.892977 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:39.892936 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-76rhf" podStartSLOduration=2.493366811 podStartE2EDuration="20.89292569s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.976439637 +0000 UTC 
m=+1.894314430" lastFinishedPulling="2026-04-16 14:52:39.375998515 +0000 UTC m=+20.293873309" observedRunningTime="2026-04-16 14:52:39.880446846 +0000 UTC m=+20.798321662" watchObservedRunningTime="2026-04-16 14:52:39.89292569 +0000 UTC m=+20.810800505" Apr 16 14:52:40.705860 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.705691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62" Apr 16 14:52:40.705994 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.705691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:52:40.705994 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:40.705951 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0" Apr 16 14:52:40.706070 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:40.706013 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d" Apr 16 14:52:40.873952 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.873906 2576 generic.go:358] "Generic (PLEG): container finished" podID="6e540316e449bbc0853c708e7d4b7aa0" containerID="45d5ddb65191ff253f733aa23975f82f1b1062db048bbb2c9a88cdda28d8290b" exitCode=0 Apr 16 14:52:40.874527 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.873991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" event={"ID":"6e540316e449bbc0853c708e7d4b7aa0","Type":"ContainerDied","Data":"45d5ddb65191ff253f733aa23975f82f1b1062db048bbb2c9a88cdda28d8290b"} Apr 16 14:52:40.875497 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.875477 2576 generic.go:358] "Generic (PLEG): container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="868e77635d6461d6d3838d63af0668eaa3cc42f224882c93d2ee1729f17f8fd3" exitCode=0 Apr 16 14:52:40.875619 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.875549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"868e77635d6461d6d3838d63af0668eaa3cc42f224882c93d2ee1729f17f8fd3"} Apr 16 14:52:40.877954 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.877936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 14:52:40.878369 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.878307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"5861221d458cfeb5ce13b3f5e5eb32dd75b43215107f4743387c1e0bceebcc53"} Apr 16 14:52:40.878369 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:52:40.878333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"b32d19f55b92cfa5130952caf110e29d5eecbd235b1443628256795654aa22d3"}
Apr 16 14:52:40.879535 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.879516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8js26" event={"ID":"15bab44e-b8bf-4170-b623-a5f844d8bfb0","Type":"ContainerStarted","Data":"620cbb79770531367c06c9d9e216be9912d69d6ecd02f4b20d83d489775d3a0f"}
Apr 16 14:52:40.880669 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.880630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2tqg5" event={"ID":"0d793735-b1fb-4ca3-bc99-1447700e773f","Type":"ContainerStarted","Data":"e48395b11219732fc229f37acbbdda3581e1e0d7aba5cf129435cf9d7e7c3a44"}
Apr 16 14:52:40.881784 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.881764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9tmph" event={"ID":"ffd2dff3-4034-47e4-b3bf-bd072dba227e","Type":"ContainerStarted","Data":"646e3be373b8c6febd72a98b57060f30fed6fe426cccba7d7265e999e68f43ea"}
Apr 16 14:52:40.883026 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.883006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xqtf" event={"ID":"653d6001-f9f0-440a-9aab-e87455bc4e3f","Type":"ContainerStarted","Data":"d633ebf3b980bb314fe46d5f5391bd4bde7107bac4828b555e489d8a15b5b528"}
Apr 16 14:52:40.884238 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.884219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" event={"ID":"44f20ff6-a1ac-47f8-845d-906159fbce7f","Type":"ContainerStarted","Data":"333d00d3da64dca4b57c5952a3b0ff86f091187c1756dc0f48ae991f67d15c37"}
Apr 16 14:52:40.889208 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.889163 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gd596" podStartSLOduration=3.448476567 podStartE2EDuration="21.889150076s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.901309375 +0000 UTC m=+1.819184168" lastFinishedPulling="2026-04-16 14:52:39.341982877 +0000 UTC m=+20.259857677" observedRunningTime="2026-04-16 14:52:39.907172966 +0000 UTC m=+20.825047782" watchObservedRunningTime="2026-04-16 14:52:40.889150076 +0000 UTC m=+21.807024892"
Apr 16 14:52:40.918709 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.918669 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9tmph" podStartSLOduration=3.579629148 podStartE2EDuration="21.918656375s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.972137302 +0000 UTC m=+1.890012096" lastFinishedPulling="2026-04-16 14:52:39.311164515 +0000 UTC m=+20.229039323" observedRunningTime="2026-04-16 14:52:40.902160786 +0000 UTC m=+21.820035600" watchObservedRunningTime="2026-04-16 14:52:40.918656375 +0000 UTC m=+21.836531187"
Apr 16 14:52:40.932551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.932510 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2tqg5" podStartSLOduration=3.480937708 podStartE2EDuration="21.932500319s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.889114306 +0000 UTC m=+1.806989100" lastFinishedPulling="2026-04-16 14:52:39.340676904 +0000 UTC m=+20.258551711" observedRunningTime="2026-04-16 14:52:40.918458932 +0000 UTC m=+21.836333749" watchObservedRunningTime="2026-04-16 14:52:40.932500319 +0000 UTC m=+21.850375188"
Apr 16 14:52:40.955525 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.955477 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7xqtf" podStartSLOduration=3.543981311 podStartE2EDuration="21.955465632s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.928660135 +0000 UTC m=+1.846534928" lastFinishedPulling="2026-04-16 14:52:39.340144443 +0000 UTC m=+20.258019249" observedRunningTime="2026-04-16 14:52:40.932623937 +0000 UTC m=+21.850498737" watchObservedRunningTime="2026-04-16 14:52:40.955465632 +0000 UTC m=+21.873340447"
Apr 16 14:52:40.970784 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:40.970673 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8js26" podStartSLOduration=3.541676125 podStartE2EDuration="21.970660193s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.91277791 +0000 UTC m=+1.830652703" lastFinishedPulling="2026-04-16 14:52:39.341761975 +0000 UTC m=+20.259636771" observedRunningTime="2026-04-16 14:52:40.970064381 +0000 UTC m=+21.887939197" watchObservedRunningTime="2026-04-16 14:52:40.970660193 +0000 UTC m=+21.888535005"
Apr 16 14:52:41.050998 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.050962 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:41.640228 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.640093 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:41.050981654Z","UUID":"1d243e6f-5222-4e05-a61f-a5ced4aa5c6f","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:41.641899 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.641875 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:41.642033 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.641910 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:52:41.889096 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.889064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" event={"ID":"44f20ff6-a1ac-47f8-845d-906159fbce7f","Type":"ContainerStarted","Data":"1eb2873408d672f299ad6adf5544a3dcdc6d90527e3639ed853d999b775bba65"}
Apr 16 14:52:41.894910 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.894056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" event={"ID":"6e540316e449bbc0853c708e7d4b7aa0","Type":"ContainerStarted","Data":"a6f57680e1514a18c846c3641a963541f9c3f25aad6be756946a514ae720df6c"}
Apr 16 14:52:41.914091 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:41.913701 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-195.ec2.internal" podStartSLOduration=21.913684572 podStartE2EDuration="21.913684572s" podCreationTimestamp="2026-04-16 14:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:41.913378758 +0000 UTC m=+22.831253576" watchObservedRunningTime="2026-04-16 14:52:41.913684572 +0000 UTC m=+22.831559389"
Apr 16 14:52:42.705370 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.705341 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:42.705549 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.705385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:42.705549 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:42.705485 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:42.705668 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:42.705562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:42.837576 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.837546 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:42.838210 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.838190 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:42.896394 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.896371 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log"
Apr 16 14:52:42.896896 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.896873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"6752a70b964401cf072603fcf651fd7810f6464c35cbcaf5c0cfd22f85cc9f80"}
Apr 16 14:52:42.898844 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.898817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" event={"ID":"44f20ff6-a1ac-47f8-845d-906159fbce7f","Type":"ContainerStarted","Data":"daf26b846bd6852c6bf0564d5592338e23730f1cdbdc3ff329399373eb2943e3"}
Apr 16 14:52:42.899146 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.899100 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:42.899716 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.899693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2tqg5"
Apr 16 14:52:42.916147 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:42.916105 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsnln" podStartSLOduration=2.971468831 podStartE2EDuration="23.916081863s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.909906378 +0000 UTC m=+1.827781175" lastFinishedPulling="2026-04-16 14:52:41.854519403 +0000 UTC m=+22.772394207" observedRunningTime="2026-04-16 14:52:42.916069498 +0000 UTC m=+23.833944313" watchObservedRunningTime="2026-04-16 14:52:42.916081863 +0000 UTC m=+23.833956681"
Apr 16 14:52:44.705397 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.705362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:44.705818 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.705362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:44.705818 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:44.705479 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:44.705818 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:44.705536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:44.907588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.907398 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log"
Apr 16 14:52:44.908930 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.908777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"85659d073e0c350a0244b5d6a254fb9131580b6ab26b3b6731564912a079f1e5"}
Apr 16 14:52:44.908930 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.908823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:44.908930 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.908833 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:44.909085 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.909021 2576 scope.go:117] "RemoveContainer" containerID="6991062f3beb10ed6d11e882f9b6479f241e0d4ae6a96eb4b4bbc06d5e960de4"
Apr 16 14:52:44.929624 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.927892 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:44.929624 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:44.928950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:45.912983 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.912947 2576 generic.go:358] "Generic (PLEG): container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="741823b6e023712d833964a6f826a38da0dd77731e289430a9b6f43606336f5a" exitCode=0
Apr 16 14:52:45.913710 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.913032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"741823b6e023712d833964a6f826a38da0dd77731e289430a9b6f43606336f5a"}
Apr 16 14:52:45.916400 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.916381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log"
Apr 16 14:52:45.916762 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.916737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" event={"ID":"7888efd0-340e-44a4-9e27-7fbbad8b7bfd","Type":"ContainerStarted","Data":"3b7570226d1c8e960560c692e37cf69593270742b637875c486e24010988c4cb"}
Apr 16 14:52:45.916880 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.916867 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 14:52:45.954905 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:45.954869 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" podStartSLOduration=8.374202675 podStartE2EDuration="26.954859989s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:20.941922796 +0000 UTC m=+1.859797590" lastFinishedPulling="2026-04-16 14:52:39.522580102 +0000 UTC m=+20.440454904" observedRunningTime="2026-04-16 14:52:45.953683731 +0000 UTC m=+26.871558546" watchObservedRunningTime="2026-04-16 14:52:45.954859989 +0000 UTC m=+26.872734804"
Apr 16 14:52:46.705238 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.705094 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:46.705323 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.705157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:46.705323 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:46.705311 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:46.705394 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:46.705376 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:46.816118 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.816027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dpr62"]
Apr 16 14:52:46.818996 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.818976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5zzl5"]
Apr 16 14:52:46.920408 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.920380 2576 generic.go:358] "Generic (PLEG): container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="58aa2ab630d73b311341661c71ab887650411d35591236e20ed7bda2aea1e2c3" exitCode=0
Apr 16 14:52:46.920753 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.920483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"58aa2ab630d73b311341661c71ab887650411d35591236e20ed7bda2aea1e2c3"}
Apr 16 14:52:46.920753 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.920588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:46.920753 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.920690 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:46.920753 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:46.920693 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:46.920914 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:46.920774 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:46.920914 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:46.920892 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 14:52:47.924306 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:47.924275 2576 generic.go:358] "Generic (PLEG): container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="95f42a089498b9bd7f43ce6905ad42aaafb5cf5ccc7a0b61f282fdd103c91481" exitCode=0
Apr 16 14:52:47.924655 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:47.924325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"95f42a089498b9bd7f43ce6905ad42aaafb5cf5ccc7a0b61f282fdd103c91481"}
Apr 16 14:52:48.705481 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:48.705445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:48.705633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:48.705452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:48.705633 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:48.705580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:48.705771 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:48.705657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:49.546588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:49.546557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:52:49.547080 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:49.546789 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 14:52:49.559068 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:49.559001 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" podUID="7888efd0-340e-44a4-9e27-7fbbad8b7bfd" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 14:52:49.569599 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:49.569570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45" podUID="7888efd0-340e-44a4-9e27-7fbbad8b7bfd" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 14:52:50.704808 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:50.704773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:50.705186 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:50.704899 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpr62" podUID="3e8487d9-5753-42d0-9838-80ce2360a1b0"
Apr 16 14:52:50.705186 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:50.704939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:50.705186 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:50.705024 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:52:52.323238 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.323206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.323254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323372 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323407 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323423 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323435 2576 projected.go:194] Error preparing data for projected volume kube-api-access-sgz68 for pod openshift-network-diagnostics/network-check-target-dpr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323437 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:24.323417096 +0000 UTC m=+65.241291890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:52.323673 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.323487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68 podName:3e8487d9-5753-42d0-9838-80ce2360a1b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:24.323470532 +0000 UTC m=+65.241345368 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgz68" (UniqueName: "kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68") pod "network-check-target-dpr62" (UID: "3e8487d9-5753-42d0-9838-80ce2360a1b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:52.370864 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.370840 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-195.ec2.internal" event="NodeReady"
Apr 16 14:52:52.370981 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.370969 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 14:52:52.425299 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.425269 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jwtvm"]
Apr 16 14:52:52.452141 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.452116 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bb2ms"]
Apr 16 14:52:52.452298 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.452277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.454991 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.454878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:52:52.454991 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.454887 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\""
Apr 16 14:52:52.454991 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.454889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:52:52.464601 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.464583 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jwtvm"]
Apr 16 14:52:52.464731 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.464609 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bb2ms"]
Apr 16 14:52:52.464795 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.464737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:52:52.467046 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.467025 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:52:52.467131 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.467073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:52:52.467168 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.467031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:52:52.467223 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.467031 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\""
Apr 16 14:52:52.625200 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:52:52.625378 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp2l\" (UniqueName: \"kubernetes.io/projected/f88b98b0-264f-41ee-a565-3a2941c70020-kube-api-access-nmp2l\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:52:52.625378 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-config-volume\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.625378 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-tmp-dir\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.625378 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9z6r\" (UniqueName: \"kubernetes.io/projected/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-kube-api-access-m9z6r\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.625585 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.625429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.705558 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.705527 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:52:52.705724 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.705564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:52:52.708479 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.708450 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\""
Apr 16 14:52:52.708606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.708490 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:52:52.708606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.708493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:52:52.708606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.708517 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\""
Apr 16 14:52:52.708606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.708454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:52:52.726441 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-config-volume\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.726527 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-tmp-dir\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.726590 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9z6r\" (UniqueName: \"kubernetes.io/projected/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-kube-api-access-m9z6r\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.726590 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:52:52.726711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:52:52.726711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp2l\" (UniqueName: \"kubernetes.io/projected/f88b98b0-264f-41ee-a565-3a2941c70020-kube-api-access-nmp2l\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:52:52.726880 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.726863 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:52.726949 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.726923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls
podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:53.226905191 +0000 UTC m=+34.144779999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found Apr 16 14:52:52.727013 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.726985 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:52.727013 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.726994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-config-volume\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:52.727125 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:52.727034 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:53.22701762 +0000 UTC m=+34.144892428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found Apr 16 14:52:52.735199 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.735176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-tmp-dir\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:52.738143 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.738019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9z6r\" (UniqueName: \"kubernetes.io/projected/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-kube-api-access-m9z6r\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:52.738237 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:52.738078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp2l\" (UniqueName: \"kubernetes.io/projected/f88b98b0-264f-41ee-a565-3a2941c70020-kube-api-access-nmp2l\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:52:53.231237 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:53.231203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:52:53.231445 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:53.231283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:53.231445 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:53.231356 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:53.231445 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:53.231362 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:53.231445 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:53.231424 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:54.231405237 +0000 UTC m=+35.149280044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found Apr 16 14:52:53.231445 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:53.231441 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:54.231432734 +0000 UTC m=+35.149307534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found Apr 16 14:52:53.973280 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:53.973259 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6"] Apr 16 14:52:53.999300 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:53.999279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6"] Apr 16 14:52:53.999415 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:53.999384 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.001788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.001765 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.001880 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.001805 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.001939 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.001813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 14:52:54.002538 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.002522 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 14:52:54.135990 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.135968 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgpr\" (UniqueName: \"kubernetes.io/projected/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-kube-api-access-9cgpr\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.136070 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.136015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-tmp\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.136070 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.136050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-klusterlet-config\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.236449 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:54.236530 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-tmp\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" 
(UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.236530 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-klusterlet-config\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.236605 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:54.236568 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:54.236665 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:52:54.236707 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:54.236668 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.236627956 +0000 UTC m=+37.154502749 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found Apr 16 14:52:54.236707 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:54.236702 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:54.236803 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:54.236737 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.236725917 +0000 UTC m=+37.154600713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found Apr 16 14:52:54.236803 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgpr\" (UniqueName: \"kubernetes.io/projected/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-kube-api-access-9cgpr\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.236904 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.236826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-tmp\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 
14:52:54.239214 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.239192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-klusterlet-config\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.246120 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.246098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgpr\" (UniqueName: \"kubernetes.io/projected/96524ebe-9e7d-422c-9aaf-9436e9cc3c13-kube-api-access-9cgpr\") pod \"klusterlet-addon-workmgr-b579885fd-6dzj6\" (UID: \"96524ebe-9e7d-422c-9aaf-9436e9cc3c13\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.309718 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.309695 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:54.483192 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.483164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6"] Apr 16 14:52:54.487047 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:52:54.487021 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96524ebe_9e7d_422c_9aaf_9436e9cc3c13.slice/crio-29643666327c347bf42463acbf9d26c2d464775e0f9f2910e866ea0566341db6 WatchSource:0}: Error finding container 29643666327c347bf42463acbf9d26c2d464775e0f9f2910e866ea0566341db6: Status 404 returned error can't find the container with id 29643666327c347bf42463acbf9d26c2d464775e0f9f2910e866ea0566341db6 Apr 16 14:52:54.940157 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.940128 2576 generic.go:358] "Generic (PLEG): container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="c12c1324a8f720470cab642417d9e989ce6cf38c001dc64a0caef029b5510c76" exitCode=0 Apr 16 14:52:54.940306 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.940208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"c12c1324a8f720470cab642417d9e989ce6cf38c001dc64a0caef029b5510c76"} Apr 16 14:52:54.941260 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:54.941236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" event={"ID":"96524ebe-9e7d-422c-9aaf-9436e9cc3c13","Type":"ContainerStarted","Data":"29643666327c347bf42463acbf9d26c2d464775e0f9f2910e866ea0566341db6"} Apr 16 14:52:55.947220 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:55.947184 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="fac8cd8d-4a46-40df-b803-d33c16259cc1" containerID="f27fa5de321cf4d5f36c4566aada7b572269f7bd8d328c242849b0f48a09829e" exitCode=0 Apr 16 14:52:55.947933 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:55.947252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerDied","Data":"f27fa5de321cf4d5f36c4566aada7b572269f7bd8d328c242849b0f48a09829e"} Apr 16 14:52:56.254437 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:56.254343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:52:56.254437 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:56.254410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:52:56.254660 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:56.254521 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:56.254660 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:56.254526 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:56.254660 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:56.254585 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:00.254562864 +0000 UTC m=+41.172437657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found Apr 16 14:52:56.254660 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:52:56.254600 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:00.254594175 +0000 UTC m=+41.172468968 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found Apr 16 14:52:56.951538 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:56.951502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gn47f" event={"ID":"fac8cd8d-4a46-40df-b803-d33c16259cc1","Type":"ContainerStarted","Data":"b93bd122ded69e15b78c0d79713ad2891c58e1440308d33e2648d25beba57eaf"} Apr 16 14:52:56.974556 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:56.974501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gn47f" podStartSLOduration=5.13628983 podStartE2EDuration="37.974483425s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:21.001956726 +0000 UTC m=+1.919831519" lastFinishedPulling="2026-04-16 14:52:53.84015031 +0000 UTC m=+34.758025114" observedRunningTime="2026-04-16 14:52:56.973812279 +0000 UTC m=+37.891687118" watchObservedRunningTime="2026-04-16 14:52:56.974483425 +0000 UTC m=+37.892358239" 
Apr 16 14:52:59.958165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:59.958132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" event={"ID":"96524ebe-9e7d-422c-9aaf-9436e9cc3c13","Type":"ContainerStarted","Data":"75ae9aa18469cebbaefc38ba498ad76eef8753977b3ad8d446efd39581767360"} Apr 16 14:52:59.958554 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:59.958451 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:59.959936 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:59.959916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" Apr 16 14:52:59.972581 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:52:59.972545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" podStartSLOduration=2.431411332 podStartE2EDuration="6.972530292s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:54.488729658 +0000 UTC m=+35.406604451" lastFinishedPulling="2026-04-16 14:52:59.029848603 +0000 UTC m=+39.947723411" observedRunningTime="2026-04-16 14:52:59.972228676 +0000 UTC m=+40.890103492" watchObservedRunningTime="2026-04-16 14:52:59.972530292 +0000 UTC m=+40.890405108" Apr 16 14:53:00.284824 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:00.284760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:53:00.284824 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:00.284799 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:53:00.284995 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:00.284878 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:00.284995 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:00.284898 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:00.284995 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:00.284920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:08.284908467 +0000 UTC m=+49.202783260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found Apr 16 14:53:00.284995 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:00.284950 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:08.284933238 +0000 UTC m=+49.202808044 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found Apr 16 14:53:08.337857 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:08.337812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:53:08.338309 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:08.337909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm" Apr 16 14:53:08.338309 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:08.337976 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:08.338309 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:08.338034 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:08.338309 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:08.338054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:24.338037473 +0000 UTC m=+65.255912268 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found
Apr 16 14:53:08.338309 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:08.338097 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:24.338080255 +0000 UTC m=+65.255955049 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:19.569525 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:19.569483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jsx45"
Apr 16 14:53:24.344825 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.344789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.344842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.344871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.344904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.344921 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.344990 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:56.344971809 +0000 UTC m=+97.262846614 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.345012 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:24.345262 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.345062 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:56.34504505 +0000 UTC m=+97.262919864 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:24.347086 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.347070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:53:24.347160 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.347123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:53:24.356104 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.356084 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:53:24.356218 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:24.356129 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:54:28.356117418 +0000 UTC m=+129.273992211 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : secret "metrics-daemon-secret" not found
Apr 16 14:53:24.357736 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.357722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:53:24.370221 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.370200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgz68\" (UniqueName: \"kubernetes.io/projected/3e8487d9-5753-42d0-9838-80ce2360a1b0-kube-api-access-sgz68\") pod \"network-check-target-dpr62\" (UID: \"3e8487d9-5753-42d0-9838-80ce2360a1b0\") " pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:53:24.525414 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.525389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\""
Apr 16 14:53:24.533986 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.533964 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:53:24.646086 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:24.646058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dpr62"]
Apr 16 14:53:24.649483 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:53:24.649458 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e8487d9_5753_42d0_9838_80ce2360a1b0.slice/crio-83b57d8a7a1ab0ffcfe8d3f031c39b9c13693cea840b3a14ad98d4da29b2f9f9 WatchSource:0}: Error finding container 83b57d8a7a1ab0ffcfe8d3f031c39b9c13693cea840b3a14ad98d4da29b2f9f9: Status 404 returned error can't find the container with id 83b57d8a7a1ab0ffcfe8d3f031c39b9c13693cea840b3a14ad98d4da29b2f9f9
Apr 16 14:53:25.005162 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:25.005128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dpr62" event={"ID":"3e8487d9-5753-42d0-9838-80ce2360a1b0","Type":"ContainerStarted","Data":"83b57d8a7a1ab0ffcfe8d3f031c39b9c13693cea840b3a14ad98d4da29b2f9f9"}
Apr 16 14:53:28.011654 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:28.011612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dpr62" event={"ID":"3e8487d9-5753-42d0-9838-80ce2360a1b0","Type":"ContainerStarted","Data":"d0c30bc252e70ce512850b4d24963fa5a96fb4e3b93e731933014b58a8113511"}
Apr 16 14:53:28.012100 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:28.011773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:53:28.025706 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:28.025591 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dpr62" podStartSLOduration=66.385115034 podStartE2EDuration="1m9.025575292s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:53:24.651271144 +0000 UTC m=+65.569145936" lastFinishedPulling="2026-04-16 14:53:27.291731401 +0000 UTC m=+68.209606194" observedRunningTime="2026-04-16 14:53:28.025077279 +0000 UTC m=+68.942952093" watchObservedRunningTime="2026-04-16 14:53:28.025575292 +0000 UTC m=+68.943450108"
Apr 16 14:53:56.355884 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:56.355857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:53:56.356185 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:56.355898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:53:56.356185 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:56.355995 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:56.356185 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:56.356006 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:56.356185 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:56.356055 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert podName:f88b98b0-264f-41ee-a565-3a2941c70020 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:00.356042145 +0000 UTC m=+161.273916937 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert") pod "ingress-canary-bb2ms" (UID: "f88b98b0-264f-41ee-a565-3a2941c70020") : secret "canary-serving-cert" not found
Apr 16 14:53:56.356185 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:53:56.356073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls podName:f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:00.356060106 +0000 UTC m=+161.273934899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls") pod "dns-default-jwtvm" (UID: "f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:59.016405 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:53:59.016377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dpr62"
Apr 16 14:54:28.360509 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:28.360466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5"
Apr 16 14:54:28.361021 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:28.360576 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:54:28.361021 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:28.360694 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs podName:fb239c54-f254-4320-9008-4b4f5895660d nodeName:}" failed. No retries permitted until 2026-04-16 14:56:30.360678819 +0000 UTC m=+251.278553624 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs") pod "network-metrics-daemon-5zzl5" (UID: "fb239c54-f254-4320-9008-4b4f5895660d") : secret "metrics-daemon-secret" not found
Apr 16 14:54:32.602461 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.602430 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xlbhf"]
Apr 16 14:54:32.605001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.604986 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.607212 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.607184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:54:32.607349 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.607255 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 14:54:32.607349 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.607266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 14:54:32.607349 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.607283 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2z9t6\""
Apr 16 14:54:32.608079 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.608064 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:54:32.615721 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.615701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 14:54:32.616440 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.616413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xlbhf"]
Apr 16 14:54:32.683399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.683399 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-serving-cert\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.683551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.683551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-tmp\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.683551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-snapshots\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.683551 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.683517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gb4\" (UniqueName: \"kubernetes.io/projected/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-kube-api-access-84gb4\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.708884 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.708863 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"]
Apr 16 14:54:32.711472 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.711454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.713769 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.713750 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 14:54:32.714131 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.714117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:32.714184 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.714126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 14:54:32.714184 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.714133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5n2sc\""
Apr 16 14:54:32.722934 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.722912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"]
Apr 16 14:54:32.784266 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.784419 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.784419 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-serving-cert\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.784419 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcxw\" (UniqueName: \"kubernetes.io/projected/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-kube-api-access-hbcxw\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.784419 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.784633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-tmp\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.784633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-snapshots\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.784633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84gb4\" (UniqueName: \"kubernetes.io/projected/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-kube-api-access-84gb4\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.785002 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.784976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-tmp\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.785089 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.785067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.785250 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.785234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.785329 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.785313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-snapshots\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.786839 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.786823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-serving-cert\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.792016 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.791998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gb4\" (UniqueName: \"kubernetes.io/projected/2bc1e63a-763f-4bce-914e-ec0b77b7b58b-kube-api-access-84gb4\") pod \"insights-operator-5785d4fcdd-xlbhf\" (UID: \"2bc1e63a-763f-4bce-914e-ec0b77b7b58b\") " pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.885975 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.885903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.885975 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.885942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcxw\" (UniqueName: \"kubernetes.io/projected/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-kube-api-access-hbcxw\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.886117 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:32.886047 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:54:32.886155 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:32.886120 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:33.386101634 +0000 UTC m=+134.303976427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found
Apr 16 14:54:32.894080 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.894050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcxw\" (UniqueName: \"kubernetes.io/projected/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-kube-api-access-hbcxw\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:32.914909 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.914886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf"
Apr 16 14:54:32.918024 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.918001 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-676c447f8b-jhws4"]
Apr 16 14:54:32.921946 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.921930 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.924346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.924195 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:54:32.924346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.924219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:54:32.924346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.924284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gptk5\""
Apr 16 14:54:32.924346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.924290 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:54:32.929416 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.929340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:54:32.930409 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.930391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-676c447f8b-jhws4"]
Apr 16 14:54:32.986460 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986460 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986460 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrptd\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986680 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986680 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986680 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986680 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:32.986680 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:32.986589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.028058 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.028027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xlbhf"]
Apr 16 14:54:33.031398 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:54:33.031374 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc1e63a_763f_4bce_914e_ec0b77b7b58b.slice/crio-d86b4e68743590bcaaaac9cfc98d03de4ac7c7808471c9bc006e95e5616b6b81 WatchSource:0}: Error finding container d86b4e68743590bcaaaac9cfc98d03de4ac7c7808471c9bc006e95e5616b6b81: Status 404 returned error can't find the container with id d86b4e68743590bcaaaac9cfc98d03de4ac7c7808471c9bc006e95e5616b6b81
Apr 16 14:54:33.087441 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087441 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrptd\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087683 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.087678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.087967 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.087725 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:54:33.087967 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.087746 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found
Apr 16 14:54:33.087967 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.087812 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:33.587790884 +0000 UTC m=+134.505665679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found
Apr 16 14:54:33.088078 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.088051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.088137 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.088113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:33.089139 ip-10-0-141-195 kubenswrapper[2576]:
I0416 14:54:33.089121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.090085 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.090063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.090172 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.090156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.095387 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.095360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.095467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.095401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrptd\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd\") pod \"image-registry-676c447f8b-jhws4\" 
(UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.129469 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.129436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf" event={"ID":"2bc1e63a-763f-4bce-914e-ec0b77b7b58b","Type":"ContainerStarted","Data":"d86b4e68743590bcaaaac9cfc98d03de4ac7c7808471c9bc006e95e5616b6b81"} Apr 16 14:54:33.390001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.389968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:54:33.390164 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.390090 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:33.390164 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.390150 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:34.390136373 +0000 UTC m=+135.308011177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found Apr 16 14:54:33.591946 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:33.591911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:33.592136 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.592090 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:33.592136 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.592121 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found Apr 16 14:54:33.592242 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:33.592204 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:34.592181481 +0000 UTC m=+135.510056276 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found Apr 16 14:54:34.398173 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:34.398136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:54:34.398657 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:34.398311 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:34.398657 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:34.398382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:36.398367432 +0000 UTC m=+137.316242228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found Apr 16 14:54:34.599588 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:34.599546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:34.599794 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:34.599730 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:34.599794 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:34.599753 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found Apr 16 14:54:34.599905 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:34.599832 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:36.599810142 +0000 UTC m=+137.517684938 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found Apr 16 14:54:35.134533 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:35.134493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf" event={"ID":"2bc1e63a-763f-4bce-914e-ec0b77b7b58b","Type":"ContainerStarted","Data":"42c62445f5f8415704b6c3333977c06c1deec59a77f010db7560a1da678af75b"} Apr 16 14:54:35.149669 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:35.149610 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf" podStartSLOduration=1.293015328 podStartE2EDuration="3.14959949s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="2026-04-16 14:54:33.033126806 +0000 UTC m=+133.951001598" lastFinishedPulling="2026-04-16 14:54:34.889710965 +0000 UTC m=+135.807585760" observedRunningTime="2026-04-16 14:54:35.149361041 +0000 UTC m=+136.067235856" watchObservedRunningTime="2026-04-16 14:54:35.14959949 +0000 UTC m=+136.067474303" Apr 16 14:54:36.415024 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:36.414994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:54:36.415376 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:36.415128 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:36.415376 
ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:36.415189 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.415173359 +0000 UTC m=+141.333048163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found Apr 16 14:54:36.616176 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:36.616145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:36.616325 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:36.616289 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:36.616325 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:36.616303 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found Apr 16 14:54:36.616392 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:36.616381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.616363954 +0000 UTC m=+141.534238747 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found Apr 16 14:54:37.542497 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.542469 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8d689"] Apr 16 14:54:37.545330 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.545315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" Apr 16 14:54:37.547235 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.547215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7cnfd\"" Apr 16 14:54:37.552109 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.552076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8d689"] Apr 16 14:54:37.589750 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.589727 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8js26_15bab44e-b8bf-4170-b623-a5f844d8bfb0/dns-node-resolver/0.log" Apr 16 14:54:37.623010 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.622984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxd7t\" (UniqueName: \"kubernetes.io/projected/07abaf9c-1ea5-40a9-85a0-8d5cf01a1693-kube-api-access-pxd7t\") pod \"network-check-source-7b678d77c7-8d689\" (UID: \"07abaf9c-1ea5-40a9-85a0-8d5cf01a1693\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" Apr 16 14:54:37.724142 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.724116 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pxd7t\" (UniqueName: \"kubernetes.io/projected/07abaf9c-1ea5-40a9-85a0-8d5cf01a1693-kube-api-access-pxd7t\") pod \"network-check-source-7b678d77c7-8d689\" (UID: \"07abaf9c-1ea5-40a9-85a0-8d5cf01a1693\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" Apr 16 14:54:37.733155 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.733128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxd7t\" (UniqueName: \"kubernetes.io/projected/07abaf9c-1ea5-40a9-85a0-8d5cf01a1693-kube-api-access-pxd7t\") pod \"network-check-source-7b678d77c7-8d689\" (UID: \"07abaf9c-1ea5-40a9-85a0-8d5cf01a1693\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" Apr 16 14:54:37.853491 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.853444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" Apr 16 14:54:37.960916 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:37.960891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8d689"] Apr 16 14:54:37.964163 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:54:37.964137 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07abaf9c_1ea5_40a9_85a0_8d5cf01a1693.slice/crio-09abf4f414b0441507194a2da10862c3981bcaac49d93e6c95a955418204d34a WatchSource:0}: Error finding container 09abf4f414b0441507194a2da10862c3981bcaac49d93e6c95a955418204d34a: Status 404 returned error can't find the container with id 09abf4f414b0441507194a2da10862c3981bcaac49d93e6c95a955418204d34a Apr 16 14:54:38.142282 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.142249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" event={"ID":"07abaf9c-1ea5-40a9-85a0-8d5cf01a1693","Type":"ContainerStarted","Data":"2e69be37e9f2551468347b80f512c45d570ec1722a0e5aff97b5a5aabbb54064"} Apr 16 14:54:38.142420 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.142288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" event={"ID":"07abaf9c-1ea5-40a9-85a0-8d5cf01a1693","Type":"ContainerStarted","Data":"09abf4f414b0441507194a2da10862c3981bcaac49d93e6c95a955418204d34a"} Apr 16 14:54:38.155403 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.155357 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8d689" podStartSLOduration=1.155340265 podStartE2EDuration="1.155340265s" podCreationTimestamp="2026-04-16 14:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:54:38.155313522 +0000 UTC m=+139.073188486" watchObservedRunningTime="2026-04-16 14:54:38.155340265 +0000 UTC m=+139.073215080" Apr 16 14:54:38.510102 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.510043 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg"] Apr 16 14:54:38.513013 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.512996 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" Apr 16 14:54:38.515215 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.515195 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 14:54:38.515312 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.515197 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:38.515312 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.515254 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5tdmj\"" Apr 16 14:54:38.521760 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.521740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg"] Apr 16 14:54:38.589207 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.589189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7xqtf_653d6001-f9f0-440a-9aab-e87455bc4e3f/node-ca/0.log" Apr 16 14:54:38.630199 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.630176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpphr\" (UniqueName: \"kubernetes.io/projected/3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c-kube-api-access-mpphr\") pod \"migrator-64d4d94569-8t6wg\" (UID: \"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" Apr 16 14:54:38.730908 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.730885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpphr\" (UniqueName: 
\"kubernetes.io/projected/3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c-kube-api-access-mpphr\") pod \"migrator-64d4d94569-8t6wg\" (UID: \"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" Apr 16 14:54:38.738438 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.738416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpphr\" (UniqueName: \"kubernetes.io/projected/3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c-kube-api-access-mpphr\") pod \"migrator-64d4d94569-8t6wg\" (UID: \"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" Apr 16 14:54:38.821854 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.821804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" Apr 16 14:54:38.934579 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:38.934534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg"] Apr 16 14:54:38.937831 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:54:38.937804 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4f3d1e_d7d2_45c2_a9ac_b5217f58501c.slice/crio-a48e253dbec19b3d38a8747727793a2b7e0311ec15368dbfae1e3c7efb14150d WatchSource:0}: Error finding container a48e253dbec19b3d38a8747727793a2b7e0311ec15368dbfae1e3c7efb14150d: Status 404 returned error can't find the container with id a48e253dbec19b3d38a8747727793a2b7e0311ec15368dbfae1e3c7efb14150d Apr 16 14:54:39.145297 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:39.145270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" 
event={"ID":"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c","Type":"ContainerStarted","Data":"a48e253dbec19b3d38a8747727793a2b7e0311ec15368dbfae1e3c7efb14150d"} Apr 16 14:54:40.442591 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:40.442561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:54:40.442909 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:40.442674 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:40.442909 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:40.442717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.442704083 +0000 UTC m=+149.360578875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found Apr 16 14:54:40.643400 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:40.643375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:54:40.643521 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:40.643504 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:40.643568 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:40.643524 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found Apr 16 14:54:40.643568 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:40.643566 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.643552045 +0000 UTC m=+149.561426841 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found
Apr 16 14:54:41.150397 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:41.150360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" event={"ID":"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c","Type":"ContainerStarted","Data":"97ea4dc04ed1751c9484dcbabd9c96711f98b04807be9742702373cacf908ba6"}
Apr 16 14:54:41.150397 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:41.150397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" event={"ID":"3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c","Type":"ContainerStarted","Data":"dc67b547748adf54e30f49a982b29ca6a1c93427f4c50c4195184e320a849a77"}
Apr 16 14:54:41.165456 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:41.165412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8t6wg" podStartSLOduration=1.871221141 podStartE2EDuration="3.165398812s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:54:38.939721752 +0000 UTC m=+139.857596544" lastFinishedPulling="2026-04-16 14:54:40.233899423 +0000 UTC m=+141.151774215" observedRunningTime="2026-04-16 14:54:41.164779167 +0000 UTC m=+142.082653981" watchObservedRunningTime="2026-04-16 14:54:41.165398812 +0000 UTC m=+142.083273608"
Apr 16 14:54:48.497079 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:48.497043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"
Apr 16 14:54:48.497418 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:48.497176 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:54:48.497418 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:48.497235 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls podName:41c23f56-15f8-40a6-80cd-4dbe07fc9c5a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.497218941 +0000 UTC m=+165.415093734 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls") pod "cluster-samples-operator-667775844f-npqr5" (UID: "41c23f56-15f8-40a6-80cd-4dbe07fc9c5a") : secret "samples-operator-tls" not found
Apr 16 14:54:48.698266 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:48.698238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:54:48.698383 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:48.698370 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:54:48.698420 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:48.698385 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-676c447f8b-jhws4: secret "image-registry-tls" not found
Apr 16 14:54:48.698454 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:48.698427 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls podName:e7d7f762-97a3-49ef-8274-38d96b0ddb6f nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.69841423 +0000 UTC m=+165.616289022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls") pod "image-registry-676c447f8b-jhws4" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f") : secret "image-registry-tls" not found
Apr 16 14:54:55.463311 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:55.463276 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jwtvm" podUID="f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4"
Apr 16 14:54:55.474429 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:55.474401 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bb2ms" podUID="f88b98b0-264f-41ee-a565-3a2941c70020"
Apr 16 14:54:55.717856 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:54:55.717799 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5zzl5" podUID="fb239c54-f254-4320-9008-4b4f5895660d"
Apr 16 14:54:56.184114 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:56.184090 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:54:59.959433 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:54:59.959397 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" podUID="96524ebe-9e7d-422c-9aaf-9436e9cc3c13" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused"
Apr 16 14:55:00.193901 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.193874 2576 generic.go:358] "Generic (PLEG): container finished" podID="96524ebe-9e7d-422c-9aaf-9436e9cc3c13" containerID="75ae9aa18469cebbaefc38ba498ad76eef8753977b3ad8d446efd39581767360" exitCode=1
Apr 16 14:55:00.194004 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.193936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" event={"ID":"96524ebe-9e7d-422c-9aaf-9436e9cc3c13","Type":"ContainerDied","Data":"75ae9aa18469cebbaefc38ba498ad76eef8753977b3ad8d446efd39581767360"}
Apr 16 14:55:00.194202 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.194187 2576 scope.go:117] "RemoveContainer" containerID="75ae9aa18469cebbaefc38ba498ad76eef8753977b3ad8d446efd39581767360"
Apr 16 14:55:00.378842 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.378800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:55:00.378981 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.378885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:55:00.381196 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.381173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4-metrics-tls\") pod \"dns-default-jwtvm\" (UID: \"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4\") " pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:55:00.381196 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.381191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f88b98b0-264f-41ee-a565-3a2941c70020-cert\") pod \"ingress-canary-bb2ms\" (UID: \"f88b98b0-264f-41ee-a565-3a2941c70020\") " pod="openshift-ingress-canary/ingress-canary-bb2ms"
Apr 16 14:55:00.387076 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.387058 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\""
Apr 16 14:55:00.396051 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.396033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:55:00.504356 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:00.504292 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jwtvm"]
Apr 16 14:55:00.507494 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:00.507464 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c2c4ed_d5b3_4fe6_be69_9b6eed9488d4.slice/crio-ce8763a192d7cb2ad967105e8c81ad208b3e2ad1ee4a568a39a969dced0c4b26 WatchSource:0}: Error finding container ce8763a192d7cb2ad967105e8c81ad208b3e2ad1ee4a568a39a969dced0c4b26: Status 404 returned error can't find the container with id ce8763a192d7cb2ad967105e8c81ad208b3e2ad1ee4a568a39a969dced0c4b26
Apr 16 14:55:01.197711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:01.197662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jwtvm" event={"ID":"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4","Type":"ContainerStarted","Data":"ce8763a192d7cb2ad967105e8c81ad208b3e2ad1ee4a568a39a969dced0c4b26"}
Apr 16 14:55:01.199280 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:01.199258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6" event={"ID":"96524ebe-9e7d-422c-9aaf-9436e9cc3c13","Type":"ContainerStarted","Data":"d91665e12d775f0fb3bc78dc3b143e8ab05c477eef7acdff4233b7f4cd7151a8"}
Apr 16 14:55:01.199590 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:01.199558 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6"
Apr 16 14:55:01.200270 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:01.200252 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b579885fd-6dzj6"
Apr 16 14:55:02.203067 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:02.203037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jwtvm" event={"ID":"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4","Type":"ContainerStarted","Data":"db8dd20e59db1dd72a8aff2d608febd5f242c418a69fc6903936e42758e0892a"}
Apr 16 14:55:02.203442 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:02.203072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jwtvm" event={"ID":"f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4","Type":"ContainerStarted","Data":"1ecda4797e36bdda031fd2f046850e05ba0843eb0b6a9758aea42d98b4012765"}
Apr 16 14:55:02.219254 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:02.219213 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jwtvm" podStartSLOduration=128.926947989 podStartE2EDuration="2m10.219198871s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:55:00.509292009 +0000 UTC m=+161.427166801" lastFinishedPulling="2026-04-16 14:55:01.801542887 +0000 UTC m=+162.719417683" observedRunningTime="2026-04-16 14:55:02.217939265 +0000 UTC m=+163.135814077" watchObservedRunningTime="2026-04-16 14:55:02.219198871 +0000 UTC m=+163.137073685"
Apr 16 14:55:03.205335 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.205304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jwtvm"
Apr 16 14:55:03.266370 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.266341 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-676c447f8b-jhws4"]
Apr 16 14:55:03.266507 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:55:03.266490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-676c447f8b-jhws4" podUID="e7d7f762-97a3-49ef-8274-38d96b0ddb6f"
Apr 16 14:55:03.307391 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.307358 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d2qg6"]
Apr 16 14:55:03.310345 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.310325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.312531 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.312509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:55:03.312531 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.312522 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clwz5\""
Apr 16 14:55:03.312681 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.312614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:55:03.320953 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.320934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d2qg6"]
Apr 16 14:55:03.400038 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.400015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.400130 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.400046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2shp\" (UniqueName: \"kubernetes.io/projected/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-api-access-f2shp\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.400130 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.400079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-data-volume\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.400130 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.400099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.400239 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.400181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-crio-socket\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.413967 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.413946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6955d6c465-cssgp"]
Apr 16 14:55:03.416832 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.416819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.428847 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.428830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6955d6c465-cssgp"]
Apr 16 14:55:03.501113 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501113 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-trusted-ca\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-certificates\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-crio-socket\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rsgq\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-kube-api-access-4rsgq\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-bound-sa-token\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-crio-socket\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501420 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-tls\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501420 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d060e048-d462-47d1-a500-82e8b6eff8ba-ca-trust-extracted\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501420 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501420 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2shp\" (UniqueName: \"kubernetes.io/projected/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-api-access-f2shp\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501566 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-image-registry-private-configuration\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501598 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-installation-pull-secrets\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.501659 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-data-volume\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501716 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.501867 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.501853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-data-volume\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.503606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.503589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.508887 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.508867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2shp\" (UniqueName: \"kubernetes.io/projected/d9e99e03-4d4d-406e-80e2-a86bf88da5d6-kube-api-access-f2shp\") pod \"insights-runtime-extractor-d2qg6\" (UID: \"d9e99e03-4d4d-406e-80e2-a86bf88da5d6\") " pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.602408 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-image-registry-private-configuration\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602489 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-installation-pull-secrets\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602489 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-trusted-ca\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602489 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-certificates\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602599 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rsgq\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-kube-api-access-4rsgq\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602599 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-bound-sa-token\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602599 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-tls\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.602772 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.602692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d060e048-d462-47d1-a500-82e8b6eff8ba-ca-trust-extracted\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.603314 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.603292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d060e048-d462-47d1-a500-82e8b6eff8ba-ca-trust-extracted\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.603562 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.603456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-trusted-ca\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.603562 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.603534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-certificates\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.604776 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.604752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-image-registry-private-configuration\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.604850 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.604832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d060e048-d462-47d1-a500-82e8b6eff8ba-installation-pull-secrets\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.605147 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.605131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-registry-tls\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.609868 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.609846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rsgq\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-kube-api-access-4rsgq\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.610319 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.610299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d060e048-d462-47d1-a500-82e8b6eff8ba-bound-sa-token\") pod \"image-registry-6955d6c465-cssgp\" (UID: \"d060e048-d462-47d1-a500-82e8b6eff8ba\") " pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.619178 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.619161 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d2qg6"
Apr 16 14:55:03.724101 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.724071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:03.729535 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.729514 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d2qg6"]
Apr 16 14:55:03.732852 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:03.732824 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e99e03_4d4d_406e_80e2_a86bf88da5d6.slice/crio-449ae87a795aded47a0846b1f6834ac478a8f9e6f6bac72b8584ac4db38ac7d4 WatchSource:0}: Error finding container 449ae87a795aded47a0846b1f6834ac478a8f9e6f6bac72b8584ac4db38ac7d4: Status 404 returned error can't find the container with id 449ae87a795aded47a0846b1f6834ac478a8f9e6f6bac72b8584ac4db38ac7d4
Apr 16 14:55:03.850772 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:03.850744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6955d6c465-cssgp"]
Apr 16 14:55:03.853993 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:03.853970 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd060e048_d462_47d1_a500_82e8b6eff8ba.slice/crio-fda1983c12f98e25e9def3fa1bb6347a88fb48de952d92648a6cafe4be8a4375 WatchSource:0}: Error finding container fda1983c12f98e25e9def3fa1bb6347a88fb48de952d92648a6cafe4be8a4375: Status 404 returned error can't find the container with id fda1983c12f98e25e9def3fa1bb6347a88fb48de952d92648a6cafe4be8a4375
Apr 16 14:55:04.208879 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.208847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6955d6c465-cssgp" event={"ID":"d060e048-d462-47d1-a500-82e8b6eff8ba","Type":"ContainerStarted","Data":"28bbd98a245b41c086d2f864d182be57abee4108ebb54655daeeb06e71c9cfbc"}
Apr 16 14:55:04.208879 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.208881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6955d6c465-cssgp" event={"ID":"d060e048-d462-47d1-a500-82e8b6eff8ba","Type":"ContainerStarted","Data":"fda1983c12f98e25e9def3fa1bb6347a88fb48de952d92648a6cafe4be8a4375"}
Apr 16 14:55:04.209289 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.208912 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6955d6c465-cssgp"
Apr 16 14:55:04.210092 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.210069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d2qg6" event={"ID":"d9e99e03-4d4d-406e-80e2-a86bf88da5d6","Type":"ContainerStarted","Data":"a46f99c3289225ce493ff7e72f536f874c788db9d9b613420496923f70c48770"}
Apr 16 14:55:04.210176 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.210099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d2qg6" event={"ID":"d9e99e03-4d4d-406e-80e2-a86bf88da5d6","Type":"ContainerStarted","Data":"449ae87a795aded47a0846b1f6834ac478a8f9e6f6bac72b8584ac4db38ac7d4"}
Apr 16 14:55:04.210226 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.210178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:55:04.213810 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.213793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-676c447f8b-jhws4"
Apr 16 14:55:04.229168 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.229132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6955d6c465-cssgp" podStartSLOduration=1.229121937 podStartE2EDuration="1.229121937s" podCreationTimestamp="2026-04-16 14:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:04.228485501 +0000 UTC m=+165.146360319" watchObservedRunningTime="2026-04-16 14:55:04.229121937 +0000 UTC m=+165.146996741"
Apr 16 14:55:04.307915 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.307887 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308059 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.307924 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308059 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.307955 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308059 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.307985 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308059 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308013 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308272 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308071 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrptd\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308272 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") "
Apr 16 14:55:04.308272 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308173 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:04.309333 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308497 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:04.309333 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.308819 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:04.309333 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.309326 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-ca-trust-extracted\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.309573 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.309349 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-certificates\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.309573 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.309364 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-trusted-ca\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.310514 ip-10-0-141-195 kubenswrapper[2576]: 
I0416 14:55:04.310488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:04.310623 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.310515 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd" (OuterVolumeSpecName: "kube-api-access-mrptd") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "kube-api-access-mrptd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:04.310857 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.310834 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:04.311160 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.311141 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:04.410696 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.410671 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-image-registry-private-configuration\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.410790 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.410697 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-bound-sa-token\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.410790 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.410720 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrptd\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-kube-api-access-mrptd\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.410790 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.410735 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-installation-pull-secrets\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:04.511199 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.511132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:55:04.513773 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.513725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41c23f56-15f8-40a6-80cd-4dbe07fc9c5a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-npqr5\" (UID: \"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:55:04.519449 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.519426 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" Apr 16 14:55:04.657544 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.657523 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5"] Apr 16 14:55:04.712583 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.712562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:55:04.714606 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.714586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"image-registry-676c447f8b-jhws4\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:55:04.813561 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.813514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") pod \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\" (UID: \"e7d7f762-97a3-49ef-8274-38d96b0ddb6f\") " Apr 16 
14:55:04.815193 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.815171 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e7d7f762-97a3-49ef-8274-38d96b0ddb6f" (UID: "e7d7f762-97a3-49ef-8274-38d96b0ddb6f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:04.914937 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:04.914917 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7d7f762-97a3-49ef-8274-38d96b0ddb6f-registry-tls\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:55:05.214013 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.213977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" event={"ID":"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a","Type":"ContainerStarted","Data":"4d53c13171d93afa99bad98dfd7515e6913196ae5f3f15ab730e7e25683c396d"} Apr 16 14:55:05.215825 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.215798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d2qg6" event={"ID":"d9e99e03-4d4d-406e-80e2-a86bf88da5d6","Type":"ContainerStarted","Data":"adc33c8c60ee58dd222fda224d5f1731eb26e851c0fcebe9c2ba5033cc4e3602"} Apr 16 14:55:05.216042 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.216024 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-676c447f8b-jhws4" Apr 16 14:55:05.282265 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.282239 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-676c447f8b-jhws4"] Apr 16 14:55:05.289828 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.289803 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-676c447f8b-jhws4"] Apr 16 14:55:05.705525 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.705477 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:55:05.708174 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.708150 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\"" Apr 16 14:55:05.709802 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.709737 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d7f762-97a3-49ef-8274-38d96b0ddb6f" path="/var/lib/kubelet/pods/e7d7f762-97a3-49ef-8274-38d96b0ddb6f/volumes" Apr 16 14:55:05.716999 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:05.716974 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bb2ms" Apr 16 14:55:06.579806 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:06.579757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bb2ms"] Apr 16 14:55:06.593626 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:06.593600 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf88b98b0_264f_41ee_a565_3a2941c70020.slice/crio-2e98dc9b703baaac05bd5580170eb999202d8d1f756212d47efca869070964bb WatchSource:0}: Error finding container 2e98dc9b703baaac05bd5580170eb999202d8d1f756212d47efca869070964bb: Status 404 returned error can't find the container with id 2e98dc9b703baaac05bd5580170eb999202d8d1f756212d47efca869070964bb Apr 16 14:55:07.223229 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.223186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" event={"ID":"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a","Type":"ContainerStarted","Data":"f20e1eb7fb64598e2a88b0278f09434790dc74c50e992aa713c2e93e8d00ccb0"} Apr 16 14:55:07.223229 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.223232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" event={"ID":"41c23f56-15f8-40a6-80cd-4dbe07fc9c5a","Type":"ContainerStarted","Data":"94ebe28d491d63b234503fef861efed4d84c6148593fb5e3a3b3568e63ac9633"} Apr 16 14:55:07.224338 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.224308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bb2ms" event={"ID":"f88b98b0-264f-41ee-a565-3a2941c70020","Type":"ContainerStarted","Data":"2e98dc9b703baaac05bd5580170eb999202d8d1f756212d47efca869070964bb"} Apr 16 14:55:07.226064 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.226031 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d2qg6" event={"ID":"d9e99e03-4d4d-406e-80e2-a86bf88da5d6","Type":"ContainerStarted","Data":"ea46980a523f523a369806e0f47696366c2bd1f76e97a767509c5e4b72ee65d4"} Apr 16 14:55:07.239860 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.239812 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-npqr5" podStartSLOduration=33.464869474 podStartE2EDuration="35.239800701s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="2026-04-16 14:55:04.692441623 +0000 UTC m=+165.610316415" lastFinishedPulling="2026-04-16 14:55:06.467372832 +0000 UTC m=+167.385247642" observedRunningTime="2026-04-16 14:55:07.238932005 +0000 UTC m=+168.156806818" watchObservedRunningTime="2026-04-16 14:55:07.239800701 +0000 UTC m=+168.157675515" Apr 16 14:55:07.263592 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.263535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d2qg6" podStartSLOduration=1.6025622959999999 podStartE2EDuration="4.263522909s" podCreationTimestamp="2026-04-16 14:55:03 +0000 UTC" firstStartedPulling="2026-04-16 14:55:03.802828535 +0000 UTC m=+164.720703327" lastFinishedPulling="2026-04-16 14:55:06.463789134 +0000 UTC m=+167.381663940" observedRunningTime="2026-04-16 14:55:07.262581065 +0000 UTC m=+168.180455878" watchObservedRunningTime="2026-04-16 14:55:07.263522909 +0000 UTC m=+168.181397723" Apr 16 14:55:07.705555 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:07.705519 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:55:08.230068 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:08.230033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bb2ms" event={"ID":"f88b98b0-264f-41ee-a565-3a2941c70020","Type":"ContainerStarted","Data":"145dc5009b3644bb82a7683ec47be845189fcc61eab2b5c7e652f9978cde6036"} Apr 16 14:55:08.252466 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:08.252426 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bb2ms" podStartSLOduration=134.840110215 podStartE2EDuration="2m16.25241264s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:55:06.596175075 +0000 UTC m=+167.514049884" lastFinishedPulling="2026-04-16 14:55:08.008477517 +0000 UTC m=+168.926352309" observedRunningTime="2026-04-16 14:55:08.250887408 +0000 UTC m=+169.168762221" watchObservedRunningTime="2026-04-16 14:55:08.25241264 +0000 UTC m=+169.170287458" Apr 16 14:55:13.211875 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.211851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jwtvm" Apr 16 14:55:13.419690 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.419664 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8d8v6"] Apr 16 14:55:13.424365 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.424345 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.426576 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.426552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:55:13.426883 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.426858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:13.427041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.426934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:55:13.427041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.427024 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:13.427243 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.426944 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:13.427308 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.427249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:13.427625 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.427607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mcggm\"" Apr 16 14:55:13.575773 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-textfile\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " 
pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.575773 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.575913 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-wtmp\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.575913 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lvrg\" (UniqueName: \"kubernetes.io/projected/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-kube-api-access-7lvrg\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.575913 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-root\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.576001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-sys\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.576001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-accelerators-collector-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.576001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-metrics-client-ca\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.576001 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.575993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-tls\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677157 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-textfile\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677157 ip-10-0-141-195 kubenswrapper[2576]: 
I0416 14:55:13.677158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-wtmp\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lvrg\" (UniqueName: \"kubernetes.io/projected/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-kube-api-access-7lvrg\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-root\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-sys\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677524 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-sys\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-wtmp\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-root\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-accelerators-collector-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-metrics-client-ca\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6" Apr 16 
14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-textfile\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.677524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-tls\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.677965 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.677943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-accelerators-collector-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.678383 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.678363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-metrics-client-ca\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.679524 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.679496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.679703 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.679686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-node-exporter-tls\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.685510 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.685483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lvrg\" (UniqueName: \"kubernetes.io/projected/032e1c7b-011b-4be3-b4a6-5a7eb806fbff-kube-api-access-7lvrg\") pod \"node-exporter-8d8v6\" (UID: \"032e1c7b-011b-4be3-b4a6-5a7eb806fbff\") " pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.737886 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:13.734456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8d8v6"
Apr 16 14:55:13.744961 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:13.744940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032e1c7b_011b_4be3_b4a6_5a7eb806fbff.slice/crio-dea198b4033b95a62aa1f8cceb3cf97a49be7fd5721203d0fc9a16978fd7a063 WatchSource:0}: Error finding container dea198b4033b95a62aa1f8cceb3cf97a49be7fd5721203d0fc9a16978fd7a063: Status 404 returned error can't find the container with id dea198b4033b95a62aa1f8cceb3cf97a49be7fd5721203d0fc9a16978fd7a063
Apr 16 14:55:14.246303 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:14.246267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8d8v6" event={"ID":"032e1c7b-011b-4be3-b4a6-5a7eb806fbff","Type":"ContainerStarted","Data":"dea198b4033b95a62aa1f8cceb3cf97a49be7fd5721203d0fc9a16978fd7a063"}
Apr 16 14:55:15.249902 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:15.249870 2576 generic.go:358] "Generic (PLEG): container finished" podID="032e1c7b-011b-4be3-b4a6-5a7eb806fbff" containerID="737031fd3d1a57581b56b77af465665604f671b6c9f7100355e2a5a70685b569" exitCode=0
Apr 16 14:55:15.250266 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:15.249915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8d8v6" event={"ID":"032e1c7b-011b-4be3-b4a6-5a7eb806fbff","Type":"ContainerDied","Data":"737031fd3d1a57581b56b77af465665604f671b6c9f7100355e2a5a70685b569"}
Apr 16 14:55:16.254144 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.254112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8d8v6" event={"ID":"032e1c7b-011b-4be3-b4a6-5a7eb806fbff","Type":"ContainerStarted","Data":"b2b06057934edce3ac4957d43fd6fd0d7236ce49854f4b410a586caa6b3071ab"}
Apr 16 14:55:16.254144 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.254145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8d8v6" event={"ID":"032e1c7b-011b-4be3-b4a6-5a7eb806fbff","Type":"ContainerStarted","Data":"dbd70677eb1960314cbbedb46a136e0195a070a899e9b52ab4003d1d9f6c853c"}
Apr 16 14:55:16.400501 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.400454 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8d8v6" podStartSLOduration=2.729898231 podStartE2EDuration="3.400438781s" podCreationTimestamp="2026-04-16 14:55:13 +0000 UTC" firstStartedPulling="2026-04-16 14:55:13.746582359 +0000 UTC m=+174.664457151" lastFinishedPulling="2026-04-16 14:55:14.417122906 +0000 UTC m=+175.334997701" observedRunningTime="2026-04-16 14:55:16.275725356 +0000 UTC m=+177.193600168" watchObservedRunningTime="2026-04-16 14:55:16.400438781 +0000 UTC m=+177.318313595"
Apr 16 14:55:16.400936 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.400917 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-9fd75b6db-n4768"]
Apr 16 14:55:16.404226 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.404207 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.406627 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2bp8t\""
Apr 16 14:55:16.406748 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406658 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 14:55:16.406748 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406706 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 14:55:16.406748 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406738 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4b096cgqs52ds\""
Apr 16 14:55:16.406897 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 14:55:16.406897 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406710 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 14:55:16.406897 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.406771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 14:55:16.414137 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.414117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9fd75b6db-n4768"]
Apr 16 14:55:16.599038 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.598977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599038 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599038 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-metrics-client-ca\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599291 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-grpc-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599291 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599291 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599291 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98s7\" (UniqueName: \"kubernetes.io/projected/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-kube-api-access-n98s7\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.599291 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.599227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699727 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699804 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699804 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-metrics-client-ca\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699804 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-grpc-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699923 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699923 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699923 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n98s7\" (UniqueName: \"kubernetes.io/projected/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-kube-api-access-n98s7\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.699923 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.699901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.700526 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.700505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-metrics-client-ca\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.702859 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.702830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.702947 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.702928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.703015 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.702960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.703015 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.703005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.703105 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.703049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-thanos-querier-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.703287 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.703271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-secret-grpc-tls\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.706934 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.706919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98s7\" (UniqueName: \"kubernetes.io/projected/a555f4b5-439d-49f4-b8ed-e9bc2efe809d-kube-api-access-n98s7\") pod \"thanos-querier-9fd75b6db-n4768\" (UID: \"a555f4b5-439d-49f4-b8ed-e9bc2efe809d\") " pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.712817 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.712797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768"
Apr 16 14:55:16.830412 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:16.830287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9fd75b6db-n4768"]
Apr 16 14:55:16.832751 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:16.832725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda555f4b5_439d_49f4_b8ed_e9bc2efe809d.slice/crio-b0ec2d0e1b615212e991bb1b5d5dce06e4bffc4091ecb6bd52c8cb604b3dae74 WatchSource:0}: Error finding container b0ec2d0e1b615212e991bb1b5d5dce06e4bffc4091ecb6bd52c8cb604b3dae74: Status 404 returned error can't find the container with id b0ec2d0e1b615212e991bb1b5d5dce06e4bffc4091ecb6bd52c8cb604b3dae74
Apr 16 14:55:17.257356 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:17.257324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"b0ec2d0e1b615212e991bb1b5d5dce06e4bffc4091ecb6bd52c8cb604b3dae74"}
Apr 16 14:55:18.142677 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.142624 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"]
Apr 16 14:55:18.145956 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.145937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"
Apr 16 14:55:18.148038 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.148010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:55:18.148155 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.148137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-xd2hn\""
Apr 16 14:55:18.152311 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.152291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"]
Apr 16 14:55:18.212308 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.212281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/246df5c2-dafa-46a3-b572-3e25f6e310a8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kdlzz\" (UID: \"246df5c2-dafa-46a3-b572-3e25f6e310a8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"
Apr 16 14:55:18.313080 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.313047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/246df5c2-dafa-46a3-b572-3e25f6e310a8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kdlzz\" (UID: \"246df5c2-dafa-46a3-b572-3e25f6e310a8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"
Apr 16 14:55:18.316107 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.316077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/246df5c2-dafa-46a3-b572-3e25f6e310a8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kdlzz\" (UID: \"246df5c2-dafa-46a3-b572-3e25f6e310a8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"
Apr 16 14:55:18.458858 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.458732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"
Apr 16 14:55:18.611357 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:18.611336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz"]
Apr 16 14:55:18.614055 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:18.614030 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod246df5c2_dafa_46a3_b572_3e25f6e310a8.slice/crio-80b6c3d63a146f84207078858e40827aa23f2ef50d94af6763c78c137c061f52 WatchSource:0}: Error finding container 80b6c3d63a146f84207078858e40827aa23f2ef50d94af6763c78c137c061f52: Status 404 returned error can't find the container with id 80b6c3d63a146f84207078858e40827aa23f2ef50d94af6763c78c137c061f52
Apr 16 14:55:19.265574 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.265498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"07e23b3eff92f2407b02ea0025265dce6000780b1f381b3cc21c95384645f9c9"}
Apr 16 14:55:19.265574 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.265539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"1d02a87a7c244d87baa9f47f25746220114df1c1544a1f1f840ab9286fde0a09"}
Apr 16 14:55:19.265574 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.265551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"e919ba8bd93fef572704b75cf1f0b7eb024ca4a71799d2edf0777c7100a6676e"}
Apr 16 14:55:19.266791 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.266762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz" event={"ID":"246df5c2-dafa-46a3-b572-3e25f6e310a8","Type":"ContainerStarted","Data":"80b6c3d63a146f84207078858e40827aa23f2ef50d94af6763c78c137c061f52"}
Apr 16 14:55:19.582323 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.582298 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:55:19.586526 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.586509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.588620 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.588713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.588799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lsms4\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.588898 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.588959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:55:19.589123 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589058 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e9qreni9a445k\""
Apr 16 14:55:19.589810 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589447 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:55:19.589810 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:55:19.589810 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589743 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:55:19.590013 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.589941 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:55:19.590166 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.590149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:55:19.590230 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.590169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:55:19.592269 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.592244 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:55:19.595125 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.595104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:55:19.600968 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.600831 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:55:19.623754 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h58l\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623863 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623863 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.623980 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.623982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624182 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624982 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624982 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624982 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624982 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.624982 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.624475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725368 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725509 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725509 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725509 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725711 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725856 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725856 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:19.725856 ip-10-0-141-195 kubenswrapper[2576]:
I0416 14:55:19.725801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.725856 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h58l\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.725979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
14:55:19.726047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726265 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726265 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726370 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726775 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.726775 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.726705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.727255 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.727229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.728842 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.728814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.729160 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.729101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.729250 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.729230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.730030 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.729977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.730528 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.730216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.730528 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.730491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.730792 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.730776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.731085 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.731000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.731585 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.731561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.732165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.732111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.732165 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.732135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.732366 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.732342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.732924 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.732905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.735464 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.735444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h58l\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l\") pod \"prometheus-k8s-0\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:19.899351 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:19.899322 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:20.061699 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.061673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:20.065097 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:55:20.065074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144abada_be56_43df_a723_bfd2c3e3cef5.slice/crio-d8717bd6a9fe00a79ef679fc28f217e7df7a0dab6965dc0a0249a2808a1f51d3 WatchSource:0}: Error finding container d8717bd6a9fe00a79ef679fc28f217e7df7a0dab6965dc0a0249a2808a1f51d3: Status 404 returned error can't find the container with id d8717bd6a9fe00a79ef679fc28f217e7df7a0dab6965dc0a0249a2808a1f51d3 Apr 16 14:55:20.271707 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.271626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"51c0e0ff3d5a487aaa08eacfbd22cded247ecf61c55a6a4cd30bd1bdda959d37"} Apr 16 14:55:20.271707 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.271679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"8fab4f96902b749fa3ecaa111479a4c39bed1a5f232a543bf12cabd5a7779a9d"} Apr 16 14:55:20.271707 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.271688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" event={"ID":"a555f4b5-439d-49f4-b8ed-e9bc2efe809d","Type":"ContainerStarted","Data":"0c39778cbc4150590f9593a01935dcbe715498dd3f2d640751a788e5ccc69609"} Apr 16 14:55:20.271891 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.271853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" Apr 16 14:55:20.272777 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.272749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"d8717bd6a9fe00a79ef679fc28f217e7df7a0dab6965dc0a0249a2808a1f51d3"} Apr 16 14:55:20.273939 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.273914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz" event={"ID":"246df5c2-dafa-46a3-b572-3e25f6e310a8","Type":"ContainerStarted","Data":"6f892366d08a5a925d5a01a8176a47170fd0bb340806ad461790825cb703bd06"} Apr 16 14:55:20.274138 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.274118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz" Apr 16 14:55:20.278348 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.278318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz" Apr 16 14:55:20.295866 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.295823 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" podStartSLOduration=1.730556284 podStartE2EDuration="4.295807631s" podCreationTimestamp="2026-04-16 14:55:16 +0000 UTC" firstStartedPulling="2026-04-16 14:55:16.834549926 +0000 UTC m=+177.752424726" lastFinishedPulling="2026-04-16 14:55:19.39980128 +0000 UTC m=+180.317676073" observedRunningTime="2026-04-16 14:55:20.29486355 +0000 UTC m=+181.212738364" watchObservedRunningTime="2026-04-16 14:55:20.295807631 +0000 UTC m=+181.213682445" Apr 16 14:55:20.309081 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:20.309043 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kdlzz" podStartSLOduration=0.94792737 podStartE2EDuration="2.309032663s" podCreationTimestamp="2026-04-16 14:55:18 +0000 UTC" firstStartedPulling="2026-04-16 14:55:18.616029169 +0000 UTC m=+179.533903962" lastFinishedPulling="2026-04-16 14:55:19.977134461 +0000 UTC m=+180.895009255" observedRunningTime="2026-04-16 14:55:20.308400052 +0000 UTC m=+181.226274865" watchObservedRunningTime="2026-04-16 14:55:20.309032663 +0000 UTC m=+181.226907476" Apr 16 14:55:21.277450 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:21.277384 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1" exitCode=0 Apr 16 14:55:21.277859 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:21.277466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1"} Apr 16 14:55:24.287700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245"} Apr 16 14:55:24.287700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0"} Apr 16 14:55:24.287700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d"} Apr 16 14:55:24.287700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf"} Apr 16 14:55:24.288090 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2"} Apr 16 14:55:24.288090 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.287715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerStarted","Data":"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c"} Apr 16 14:55:24.316531 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.316487 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.604390644 podStartE2EDuration="5.316475284s" podCreationTimestamp="2026-04-16 14:55:19 +0000 UTC" firstStartedPulling="2026-04-16 14:55:20.066914029 +0000 UTC m=+180.984788834" lastFinishedPulling="2026-04-16 14:55:23.778998681 +0000 UTC m=+184.696873474" observedRunningTime="2026-04-16 14:55:24.314234365 +0000 UTC m=+185.232109177" watchObservedRunningTime="2026-04-16 14:55:24.316475284 +0000 UTC m=+185.234350102" Apr 16 14:55:24.899543 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:24.899516 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 14:55:25.219525 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:25.219470 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6955d6c465-cssgp" Apr 16 14:55:26.283839 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:26.283816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-9fd75b6db-n4768" Apr 16 14:55:51.362586 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:51.362552 2576 generic.go:358] "Generic (PLEG): container finished" podID="2bc1e63a-763f-4bce-914e-ec0b77b7b58b" containerID="42c62445f5f8415704b6c3333977c06c1deec59a77f010db7560a1da678af75b" exitCode=0 Apr 16 14:55:51.362989 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:51.362629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf" event={"ID":"2bc1e63a-763f-4bce-914e-ec0b77b7b58b","Type":"ContainerDied","Data":"42c62445f5f8415704b6c3333977c06c1deec59a77f010db7560a1da678af75b"} Apr 16 14:55:51.362989 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:51.362979 2576 scope.go:117] "RemoveContainer" containerID="42c62445f5f8415704b6c3333977c06c1deec59a77f010db7560a1da678af75b" Apr 16 14:55:52.367025 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:55:52.366991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xlbhf" event={"ID":"2bc1e63a-763f-4bce-914e-ec0b77b7b58b","Type":"ContainerStarted","Data":"1eb37ddbc73ebfea3ec2fe58e94cbe05e6f90fba0c2a13ea9f7e39d78ce5b024"} Apr 16 14:56:19.900050 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:19.900017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:19.918875 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:19.918851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 14:56:20.455530 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:20.455503 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:30.437188 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:30.437112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:56:30.439223 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:30.439204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb239c54-f254-4320-9008-4b4f5895660d-metrics-certs\") pod \"network-metrics-daemon-5zzl5\" (UID: \"fb239c54-f254-4320-9008-4b4f5895660d\") " pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:56:30.508358 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:30.508336 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\"" Apr 16 14:56:30.516249 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:30.516231 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zzl5" Apr 16 14:56:30.625734 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:30.625690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5zzl5"] Apr 16 14:56:30.628311 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:56:30.628281 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb239c54_f254_4320_9008_4b4f5895660d.slice/crio-7edb3dc7790b090a6e7633ab602ea110a6179afe86de5f92f5ce9862f0f185ec WatchSource:0}: Error finding container 7edb3dc7790b090a6e7633ab602ea110a6179afe86de5f92f5ce9862f0f185ec: Status 404 returned error can't find the container with id 7edb3dc7790b090a6e7633ab602ea110a6179afe86de5f92f5ce9862f0f185ec Apr 16 14:56:31.476529 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:31.476459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zzl5" event={"ID":"fb239c54-f254-4320-9008-4b4f5895660d","Type":"ContainerStarted","Data":"7edb3dc7790b090a6e7633ab602ea110a6179afe86de5f92f5ce9862f0f185ec"} Apr 16 14:56:32.483853 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:32.483813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zzl5" event={"ID":"fb239c54-f254-4320-9008-4b4f5895660d","Type":"ContainerStarted","Data":"171ea85c6b1252abbaf5fbdc3daaf6fd6e6427905c8929ab89859438894872f4"} Apr 16 14:56:32.483853 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:32.483852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zzl5" event={"ID":"fb239c54-f254-4320-9008-4b4f5895660d","Type":"ContainerStarted","Data":"f93edafcf31cd925eab27c9801f17a778923ef5227972dc15a35c69bc573ec1f"} Apr 16 14:56:32.498509 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:32.498468 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-5zzl5" podStartSLOduration=252.519109108 podStartE2EDuration="4m13.498448493s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:56:30.630134254 +0000 UTC m=+251.548009060" lastFinishedPulling="2026-04-16 14:56:31.609473648 +0000 UTC m=+252.527348445" observedRunningTime="2026-04-16 14:56:32.497961425 +0000 UTC m=+253.415836238" watchObservedRunningTime="2026-04-16 14:56:32.498448493 +0000 UTC m=+253.416323312" Apr 16 14:56:37.935934 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.935899 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:37.936454 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936403 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="prometheus" containerID="cri-o://609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c" gracePeriod=600 Apr 16 14:56:37.936571 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936414 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="thanos-sidecar" containerID="cri-o://c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" gracePeriod=600 Apr 16 14:56:37.936571 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936418 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" gracePeriod=600 Apr 16 14:56:37.936571 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936400 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy" containerID="cri-o://60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" gracePeriod=600 Apr 16 14:56:37.936571 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936417 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="config-reloader" containerID="cri-o://a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2" gracePeriod=600 Apr 16 14:56:37.936797 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:37.936442 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-web" containerID="cri-o://7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" gracePeriod=600 Apr 16 14:56:38.172011 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.171987 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.292867 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h58l\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.292867 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292912 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") 
" Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292929 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292946 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293041 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.292998 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). 
InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:38.293424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293375 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:38.293835 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293835 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293569 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293835 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293601 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.293835 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293829 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293864 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293911 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294047 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.293999 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle\") pod \"144abada-be56-43df-a723-bfd2c3e3cef5\" (UID: \"144abada-be56-43df-a723-bfd2c3e3cef5\") " Apr 16 14:56:38.294276 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.294225 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-metrics-client-ca\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.294276 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.294242 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.294927 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.294595 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:56:38.294927 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.294674 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:38.296604 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.296436 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out" (OuterVolumeSpecName: "config-out") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:56:38.296604 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.296517 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l" (OuterVolumeSpecName: "kube-api-access-7h58l") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "kube-api-access-7h58l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:38.296604 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.296572 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297152 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297091 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:38.297247 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297210 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297306 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297262 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config" (OuterVolumeSpecName: "config") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297782 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297367 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297782 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297390 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297782 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:38.297919 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.297919 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297868 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:38.297989 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.297977 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.298961 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.298925 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.307892 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.307873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config" (OuterVolumeSpecName: "web-config") pod "144abada-be56-43df-a723-bfd2c3e3cef5" (UID: "144abada-be56-43df-a723-bfd2c3e3cef5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:38.395346 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395328 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7h58l\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-kube-api-access-7h58l\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395349 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-config-out\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395357 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-metrics-client-certs\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395367 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-kube-rbac-proxy\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395376 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395385 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-web-config\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395393 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-db\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395401 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-config\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395409 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395417 2576 
reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/144abada-be56-43df-a723-bfd2c3e3cef5-tls-assets\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395425 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395436 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395433 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395783 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395442 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395783 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395451 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/144abada-be56-43df-a723-bfd2c3e3cef5-secret-grpc-tls\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395783 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.395459 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.395783 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:56:38.395467 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144abada-be56-43df-a723-bfd2c3e3cef5-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 14:56:38.501324 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501298 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501325 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501337 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501347 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501354 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501362 2576 generic.go:358] "Generic (PLEG): container finished" podID="144abada-be56-43df-a723-bfd2c3e3cef5" containerID="609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c" exitCode=0 Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:56:38.501373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245"} Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0"} Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501407 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d"} Apr 16 14:56:38.501424 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf"} Apr 16 14:56:38.501788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2"} Apr 16 14:56:38.501788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c"} Apr 16 14:56:38.501788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"144abada-be56-43df-a723-bfd2c3e3cef5","Type":"ContainerDied","Data":"d8717bd6a9fe00a79ef679fc28f217e7df7a0dab6965dc0a0249a2808a1f51d3"} Apr 16 14:56:38.501788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.501469 2576 scope.go:117] "RemoveContainer" containerID="e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" Apr 16 14:56:38.508464 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.508448 2576 scope.go:117] "RemoveContainer" containerID="60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" Apr 16 14:56:38.514775 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.514690 2576 scope.go:117] "RemoveContainer" containerID="7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" Apr 16 14:56:38.520738 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.520726 2576 scope.go:117] "RemoveContainer" containerID="c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" Apr 16 14:56:38.525148 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.525127 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:38.527872 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.527855 2576 scope.go:117] "RemoveContainer" containerID="a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2" Apr 16 14:56:38.529555 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.529502 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:38.536783 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.536766 2576 scope.go:117] "RemoveContainer" 
containerID="609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c" Apr 16 14:56:38.542998 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.542984 2576 scope.go:117] "RemoveContainer" containerID="41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1" Apr 16 14:56:38.549300 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.549275 2576 scope.go:117] "RemoveContainer" containerID="e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" Apr 16 14:56:38.549526 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.549509 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245\": container with ID starting with e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245 not found: ID does not exist" containerID="e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" Apr 16 14:56:38.549580 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.549532 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245"} err="failed to get container status \"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245\": rpc error: code = NotFound desc = could not find container \"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245\": container with ID starting with e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245 not found: ID does not exist" Apr 16 14:56:38.549580 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.549560 2576 scope.go:117] "RemoveContainer" containerID="60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" Apr 16 14:56:38.549787 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.549771 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0\": container with ID starting with 60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0 not found: ID does not exist" containerID="60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" Apr 16 14:56:38.549820 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.549801 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0"} err="failed to get container status \"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0\": rpc error: code = NotFound desc = could not find container \"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0\": container with ID starting with 60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0 not found: ID does not exist" Apr 16 14:56:38.549820 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.549816 2576 scope.go:117] "RemoveContainer" containerID="7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" Apr 16 14:56:38.550012 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.549998 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d\": container with ID starting with 7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d not found: ID does not exist" containerID="7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" Apr 16 14:56:38.550051 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550016 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d"} err="failed to get container status \"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d\": rpc error: code = NotFound desc = could not find container 
\"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d\": container with ID starting with 7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d not found: ID does not exist" Apr 16 14:56:38.550051 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550029 2576 scope.go:117] "RemoveContainer" containerID="c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" Apr 16 14:56:38.550191 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.550176 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf\": container with ID starting with c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf not found: ID does not exist" containerID="c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" Apr 16 14:56:38.550237 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550194 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf"} err="failed to get container status \"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf\": rpc error: code = NotFound desc = could not find container \"c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf\": container with ID starting with c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf not found: ID does not exist" Apr 16 14:56:38.550237 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550207 2576 scope.go:117] "RemoveContainer" containerID="a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2" Apr 16 14:56:38.550379 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.550364 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2\": container with ID starting with 
a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2 not found: ID does not exist" containerID="a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2" Apr 16 14:56:38.550415 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550383 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2"} err="failed to get container status \"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2\": rpc error: code = NotFound desc = could not find container \"a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2\": container with ID starting with a0effcc0e1a181eddf987e37c2a7fe28c8b8cab6311661561a948f2fd9b72bd2 not found: ID does not exist" Apr 16 14:56:38.550415 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550395 2576 scope.go:117] "RemoveContainer" containerID="609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c" Apr 16 14:56:38.550590 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.550577 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c\": container with ID starting with 609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c not found: ID does not exist" containerID="609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c" Apr 16 14:56:38.550631 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550593 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c"} err="failed to get container status \"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c\": rpc error: code = NotFound desc = could not find container \"609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c\": container with ID starting with 
609eacaadf4f846be516e67a03090460ac5d47ece7fea4d5460e69959246120c not found: ID does not exist" Apr 16 14:56:38.550631 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550605 2576 scope.go:117] "RemoveContainer" containerID="41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1" Apr 16 14:56:38.550836 ip-10-0-141-195 kubenswrapper[2576]: E0416 14:56:38.550820 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1\": container with ID starting with 41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1 not found: ID does not exist" containerID="41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1" Apr 16 14:56:38.550882 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550839 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1"} err="failed to get container status \"41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1\": rpc error: code = NotFound desc = could not find container \"41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1\": container with ID starting with 41ac7da6624f0fcdc16cda30157146e07eb9a5d15fe7d23d6bb1c4a8b9aa24e1 not found: ID does not exist" Apr 16 14:56:38.550882 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550851 2576 scope.go:117] "RemoveContainer" containerID="e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245" Apr 16 14:56:38.551012 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.550994 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245"} err="failed to get container status \"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245\": rpc error: code = NotFound desc = could not find 
container \"e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245\": container with ID starting with e06e91e9d6b2a2c2d45bcd27ac60dc430535396faf92c85658bca9331c24a245 not found: ID does not exist" Apr 16 14:56:38.551075 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.551016 2576 scope.go:117] "RemoveContainer" containerID="60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0" Apr 16 14:56:38.551198 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.551182 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0"} err="failed to get container status \"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0\": rpc error: code = NotFound desc = could not find container \"60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0\": container with ID starting with 60e8c58c3afe43b06f4d3171c7bca1e470062d1a132565ceb666863047f3cac0 not found: ID does not exist" Apr 16 14:56:38.551259 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.551199 2576 scope.go:117] "RemoveContainer" containerID="7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d" Apr 16 14:56:38.551388 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.551373 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d"} err="failed to get container status \"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d\": rpc error: code = NotFound desc = could not find container \"7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d\": container with ID starting with 7feae9d79cb7f65b44e98d928b40fad058409a8420885ed0d2fe909808a5e23d not found: ID does not exist" Apr 16 14:56:38.551449 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.551389 2576 scope.go:117] "RemoveContainer" 
containerID="c97912db684a01000c5df4f9e3d265a5decbd00782f521b29386e1ab88faecbf" [... identical "RemoveContainer" / "DeleteContainer returned error" (rpc NotFound) cycles for the same seven container IDs repeated between 14:56:38.551583 and 14:56:38.558514; omitted ...] Apr 16 14:56:38.558898 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.558881 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:38.559242 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559227 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="prometheus" Apr 16 14:56:38.559273 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559247 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="prometheus" Apr 16 14:56:38.559273 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559264 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy" Apr 16 14:56:38.559332 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559272 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy" Apr 16 14:56:38.559332 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559288 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="config-reloader" Apr 16 14:56:38.559332 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559297 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="config-reloader" Apr 16 14:56:38.559332 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559320 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-web" Apr 16 14:56:38.559332 ip-10-0-141-195
kubenswrapper[2576]: I0416 14:56:38.559330 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-web" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559340 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="thanos-sidecar" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559348 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="thanos-sidecar" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559357 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-thanos" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559365 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-thanos" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559376 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="init-config-reloader" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559384 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="init-config-reloader" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559444 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="config-reloader" Apr 16 14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559456 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="thanos-sidecar" Apr 16 
14:56:38.559467 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559466 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-web" Apr 16 14:56:38.559732 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559477 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy" Apr 16 14:56:38.559732 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559485 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="kube-rbac-proxy-thanos" Apr 16 14:56:38.559732 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.559494 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" containerName="prometheus" Apr 16 14:56:38.564374 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.564359 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.566849 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.566834 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:56:38.567105 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:56:38.567313 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:56:38.567392 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:56:38.567392 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lsms4\"" Apr 16 14:56:38.567392 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567379 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:56:38.567517 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.567394 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:56:38.568357 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.568341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:56:38.568412 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.568372 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:56:38.568510 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.568495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:56:38.568562 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.568514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:56:38.568939 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.568922 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:56:38.569393 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.569368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e9qreni9a445k\"" Apr 16 14:56:38.570494 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.570473 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:56:38.574626 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.574446 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:56:38.575804 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.575778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:38.697450 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-web-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697581 
ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697581 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697581 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697700 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697752 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697788 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.697931 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.697931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjw8\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-kube-api-access-xvjw8\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.798741 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.798741 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.798741 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.798741 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798812 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.798993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjw8\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-kube-api-access-xvjw8\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-web-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799108 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799664 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799664 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799664 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.799664 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config-out\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.800039 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.800039 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.799729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802256 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.801810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config-out\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802256 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802256 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802256 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802563 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802625 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-web-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802625 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.802944 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.802829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.803378 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.803351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.803883 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.803856 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.804327 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.804309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-config\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.804801 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.804779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.804801 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.804796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.805209 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.805191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.810515 ip-10-0-141-195 kubenswrapper[2576]: I0416 
14:56:38.810499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjw8\" (UniqueName: \"kubernetes.io/projected/dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59-kube-api-access-xvjw8\") pod \"prometheus-k8s-0\" (UID: \"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:38.873633 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:38.873594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:39.000863 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:39.000813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:56:39.004876 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:56:39.004845 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe686f1_f0d5_49a0_81a4_42aa4e1e6b59.slice/crio-84203fe62ba75f802c91376abef02cac197359f827395c180f5ee7dc3b51db8f WatchSource:0}: Error finding container 84203fe62ba75f802c91376abef02cac197359f827395c180f5ee7dc3b51db8f: Status 404 returned error can't find the container with id 84203fe62ba75f802c91376abef02cac197359f827395c180f5ee7dc3b51db8f Apr 16 14:56:39.505850 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:39.505818 2576 generic.go:358] "Generic (PLEG): container finished" podID="dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59" containerID="61fd2237999a8acd641c31b6322561f73e48dfa3d0da957b54539aefdd2e9cac" exitCode=0 Apr 16 14:56:39.505958 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:39.505860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerDied","Data":"61fd2237999a8acd641c31b6322561f73e48dfa3d0da957b54539aefdd2e9cac"} Apr 16 14:56:39.505958 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:39.505880 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"84203fe62ba75f802c91376abef02cac197359f827395c180f5ee7dc3b51db8f"} Apr 16 14:56:39.712725 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:39.712652 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144abada-be56-43df-a723-bfd2c3e3cef5" path="/var/lib/kubelet/pods/144abada-be56-43df-a723-bfd2c3e3cef5/volumes" Apr 16 14:56:40.511152 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"a5b8882447455ca39a9de5e08a97e1592693c8b6f2ec84bfb1352d868b9091c7"} Apr 16 14:56:40.511473 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"e8463584a77b4c3b3a95f787da704174d1d2de17a36d0394c923e4ca5fbcda58"} Apr 16 14:56:40.511473 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"767636c6c1a31e1ecd7962a05b7fca2494b77c4eb6bd55c13ef2a936957ea2f8"} Apr 16 14:56:40.511473 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"8e91b33677bc68a242ced0dbf3468d5f8ba8d60dd6e9221f054a4d5a0f298728"} Apr 16 14:56:40.511473 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"3a3cdf1e36519ecec74126f78c3bdebfdab322b5e38c2dafad66b2544d2f041d"} Apr 16 14:56:40.511473 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.511194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59","Type":"ContainerStarted","Data":"84f6939e980d6686619967ff598e8b806c7f72b25b1dbbecf55cb37997eefd46"} Apr 16 14:56:40.538102 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:40.538031 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.538017308 podStartE2EDuration="2.538017308s" podCreationTimestamp="2026-04-16 14:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:40.535787103 +0000 UTC m=+261.453661918" watchObservedRunningTime="2026-04-16 14:56:40.538017308 +0000 UTC m=+261.455892122" Apr 16 14:56:43.874682 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:56:43.874626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:19.585987 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:57:19.585966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 14:57:19.586370 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:57:19.586069 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 14:57:19.590048 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:57:19.590028 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:57:38.873977 ip-10-0-141-195 
kubenswrapper[2576]: I0416 14:57:38.873943 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:38.890364 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:57:38.890338 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:39.693264 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:57:39.693241 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:58:29.820270 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.820184 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5927m"] Apr 16 14:58:29.822271 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.822251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:29.824520 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.824499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:58:29.830823 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.830792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5927m"] Apr 16 14:58:29.903945 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.903923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-kubelet-config\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:29.904036 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.903952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/e378af81-ef05-47a5-b87b-2d78018a3700-original-pull-secret\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:29.904036 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:29.904028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-dbus\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.004519 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.004499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e378af81-ef05-47a5-b87b-2d78018a3700-original-pull-secret\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.004622 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.004559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-dbus\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.004622 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.004589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-kubelet-config\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.004729 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.004670 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-kubelet-config\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.004800 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.004777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e378af81-ef05-47a5-b87b-2d78018a3700-dbus\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.006869 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.006850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e378af81-ef05-47a5-b87b-2d78018a3700-original-pull-secret\") pod \"global-pull-secret-syncer-5927m\" (UID: \"e378af81-ef05-47a5-b87b-2d78018a3700\") " pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.131722 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.131665 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5927m" Apr 16 14:58:30.248109 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.248082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5927m"] Apr 16 14:58:30.250516 ip-10-0-141-195 kubenswrapper[2576]: W0416 14:58:30.250487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode378af81_ef05_47a5_b87b_2d78018a3700.slice/crio-4a0b5ac1adc9d6662855731480f02b730e89e08a44c6214de72fbbc0bf925997 WatchSource:0}: Error finding container 4a0b5ac1adc9d6662855731480f02b730e89e08a44c6214de72fbbc0bf925997: Status 404 returned error can't find the container with id 4a0b5ac1adc9d6662855731480f02b730e89e08a44c6214de72fbbc0bf925997 Apr 16 14:58:30.252102 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.252086 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:58:30.814332 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:30.814290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5927m" event={"ID":"e378af81-ef05-47a5-b87b-2d78018a3700","Type":"ContainerStarted","Data":"4a0b5ac1adc9d6662855731480f02b730e89e08a44c6214de72fbbc0bf925997"} Apr 16 14:58:34.827127 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:34.827085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5927m" event={"ID":"e378af81-ef05-47a5-b87b-2d78018a3700","Type":"ContainerStarted","Data":"8be1c92c60fedd7780cf07f7c6c0e4217be0b42fd2c402596dd658989cc17a19"} Apr 16 14:58:34.840585 ip-10-0-141-195 kubenswrapper[2576]: I0416 14:58:34.840530 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5927m" podStartSLOduration=2.273617395 podStartE2EDuration="5.840515945s" podCreationTimestamp="2026-04-16 14:58:29 
+0000 UTC" firstStartedPulling="2026-04-16 14:58:30.252221198 +0000 UTC m=+371.170095991" lastFinishedPulling="2026-04-16 14:58:33.819119733 +0000 UTC m=+374.736994541" observedRunningTime="2026-04-16 14:58:34.839490267 +0000 UTC m=+375.757365096" watchObservedRunningTime="2026-04-16 14:58:34.840515945 +0000 UTC m=+375.758390759" Apr 16 15:02:19.607792 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:19.607721 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:02:19.608181 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:19.607974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:02:53.994302 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:53.994265 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-xftc8"] Apr 16 15:02:53.997436 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:53.997422 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-xftc8" Apr 16 15:02:53.999778 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:53.999756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:02:53.999896 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:53.999779 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:02:54.000517 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.000499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qpr59\"" Apr 16 15:02:54.000563 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.000512 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:02:54.003386 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.003360 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-xftc8"] Apr 16 15:02:54.034372 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.034347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbmn\" (UniqueName: \"kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn\") pod \"s3-init-xftc8\" (UID: \"f904b88f-252d-430d-a4a8-25180af6783b\") " pod="kserve/s3-init-xftc8" Apr 16 15:02:54.134832 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.134808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbmn\" (UniqueName: \"kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn\") pod \"s3-init-xftc8\" (UID: \"f904b88f-252d-430d-a4a8-25180af6783b\") " pod="kserve/s3-init-xftc8" Apr 16 15:02:54.144504 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.144481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbmn\" 
(UniqueName: \"kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn\") pod \"s3-init-xftc8\" (UID: \"f904b88f-252d-430d-a4a8-25180af6783b\") " pod="kserve/s3-init-xftc8" Apr 16 15:02:54.318953 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.318896 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-xftc8" Apr 16 15:02:54.437219 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.437188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-xftc8"] Apr 16 15:02:54.440817 ip-10-0-141-195 kubenswrapper[2576]: W0416 15:02:54.440782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf904b88f_252d_430d_a4a8_25180af6783b.slice/crio-82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58 WatchSource:0}: Error finding container 82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58: Status 404 returned error can't find the container with id 82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58 Apr 16 15:02:54.560705 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:54.560670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xftc8" event={"ID":"f904b88f-252d-430d-a4a8-25180af6783b","Type":"ContainerStarted","Data":"82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58"} Apr 16 15:02:59.579278 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:59.579184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xftc8" event={"ID":"f904b88f-252d-430d-a4a8-25180af6783b","Type":"ContainerStarted","Data":"453de05cdc93e62137c599b7ef1297f4efda99714d7f45a7c7e7bc8938a65590"} Apr 16 15:02:59.596405 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:02:59.596351 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-xftc8" podStartSLOduration=1.8915523159999998 podStartE2EDuration="6.596333385s" 
podCreationTimestamp="2026-04-16 15:02:53 +0000 UTC" firstStartedPulling="2026-04-16 15:02:54.443072962 +0000 UTC m=+635.360947756" lastFinishedPulling="2026-04-16 15:02:59.147854025 +0000 UTC m=+640.065728825" observedRunningTime="2026-04-16 15:02:59.595403061 +0000 UTC m=+640.513277873" watchObservedRunningTime="2026-04-16 15:02:59.596333385 +0000 UTC m=+640.514208235" Apr 16 15:03:02.588557 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:02.588524 2576 generic.go:358] "Generic (PLEG): container finished" podID="f904b88f-252d-430d-a4a8-25180af6783b" containerID="453de05cdc93e62137c599b7ef1297f4efda99714d7f45a7c7e7bc8938a65590" exitCode=0 Apr 16 15:03:02.588902 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:02.588600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xftc8" event={"ID":"f904b88f-252d-430d-a4a8-25180af6783b","Type":"ContainerDied","Data":"453de05cdc93e62137c599b7ef1297f4efda99714d7f45a7c7e7bc8938a65590"} Apr 16 15:03:03.717939 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:03.717917 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-xftc8" Apr 16 15:03:03.822766 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:03.822737 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbmn\" (UniqueName: \"kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn\") pod \"f904b88f-252d-430d-a4a8-25180af6783b\" (UID: \"f904b88f-252d-430d-a4a8-25180af6783b\") " Apr 16 15:03:03.824985 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:03.824948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn" (OuterVolumeSpecName: "kube-api-access-hsbmn") pod "f904b88f-252d-430d-a4a8-25180af6783b" (UID: "f904b88f-252d-430d-a4a8-25180af6783b"). InnerVolumeSpecName "kube-api-access-hsbmn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:03:03.923608 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:03.923590 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsbmn\" (UniqueName: \"kubernetes.io/projected/f904b88f-252d-430d-a4a8-25180af6783b-kube-api-access-hsbmn\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 15:03:04.597686 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:04.597662 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-xftc8" Apr 16 15:03:04.597686 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:04.597671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xftc8" event={"ID":"f904b88f-252d-430d-a4a8-25180af6783b","Type":"ContainerDied","Data":"82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58"} Apr 16 15:03:04.597887 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:03:04.597700 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82df0442a15efd88e1779376eadd6b0262c2f5cde6419612c551fd05928afd58" Apr 16 15:07:19.630688 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:07:19.630593 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:07:19.632007 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:07:19.631986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:12:19.651383 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:12:19.651348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:12:19.653225 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:12:19.653200 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:16:52.655269 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.655234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c628w/must-gather-6ptvz"] Apr 16 15:16:52.655765 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.655504 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904b88f-252d-430d-a4a8-25180af6783b" containerName="s3-init" Apr 16 15:16:52.655765 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.655514 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904b88f-252d-430d-a4a8-25180af6783b" containerName="s3-init" Apr 16 15:16:52.655765 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.655570 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f904b88f-252d-430d-a4a8-25180af6783b" containerName="s3-init" Apr 16 15:16:52.657415 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.657400 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.660520 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.660493 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-c628w\"/\"default-dockercfg-d8djl\"" Apr 16 15:16:52.664034 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.660829 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-c628w\"/\"openshift-service-ca.crt\"" Apr 16 15:16:52.664034 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.660858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-c628w\"/\"kube-root-ca.crt\"" Apr 16 15:16:52.668433 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.668407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c628w/must-gather-6ptvz"] Apr 16 15:16:52.764090 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.764062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.764236 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.764106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26w2j\" (UniqueName: \"kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.864580 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.864532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.864580 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.864583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26w2j\" (UniqueName: \"kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.864899 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.864879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.873310 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.873275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26w2j\" (UniqueName: \"kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j\") pod \"must-gather-6ptvz\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:52.983203 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:52.983112 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:16:53.099394 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:53.099362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c628w/must-gather-6ptvz"] Apr 16 15:16:53.102382 ip-10-0-141-195 kubenswrapper[2576]: W0416 15:16:53.102354 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68bfc33e_ccd5_48f7_b0c6_31b75b0611af.slice/crio-69e60d93ac178b4b9ef712ee34c2bfc1148e6b8ee6dcf021fa28d2c507940eba WatchSource:0}: Error finding container 69e60d93ac178b4b9ef712ee34c2bfc1148e6b8ee6dcf021fa28d2c507940eba: Status 404 returned error can't find the container with id 69e60d93ac178b4b9ef712ee34c2bfc1148e6b8ee6dcf021fa28d2c507940eba Apr 16 15:16:53.104039 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:53.104021 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:16:53.929148 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:53.929097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c628w/must-gather-6ptvz" event={"ID":"68bfc33e-ccd5-48f7-b0c6-31b75b0611af","Type":"ContainerStarted","Data":"69e60d93ac178b4b9ef712ee34c2bfc1148e6b8ee6dcf021fa28d2c507940eba"} Apr 16 15:16:57.943260 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:57.943223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c628w/must-gather-6ptvz" event={"ID":"68bfc33e-ccd5-48f7-b0c6-31b75b0611af","Type":"ContainerStarted","Data":"6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4"} Apr 16 15:16:57.943260 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:57.943264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c628w/must-gather-6ptvz" 
event={"ID":"68bfc33e-ccd5-48f7-b0c6-31b75b0611af","Type":"ContainerStarted","Data":"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7"} Apr 16 15:16:57.957200 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:16:57.957143 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c628w/must-gather-6ptvz" podStartSLOduration=1.448844928 podStartE2EDuration="5.957126203s" podCreationTimestamp="2026-04-16 15:16:52 +0000 UTC" firstStartedPulling="2026-04-16 15:16:53.104207733 +0000 UTC m=+1474.022082539" lastFinishedPulling="2026-04-16 15:16:57.612489015 +0000 UTC m=+1478.530363814" observedRunningTime="2026-04-16 15:16:57.95670145 +0000 UTC m=+1478.874576278" watchObservedRunningTime="2026-04-16 15:16:57.957126203 +0000 UTC m=+1478.875001019" Apr 16 15:17:14.992276 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:14.992241 2576 generic.go:358] "Generic (PLEG): container finished" podID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerID="25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7" exitCode=0 Apr 16 15:17:14.992736 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:14.992317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c628w/must-gather-6ptvz" event={"ID":"68bfc33e-ccd5-48f7-b0c6-31b75b0611af","Type":"ContainerDied","Data":"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7"} Apr 16 15:17:14.992736 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:14.992615 2576 scope.go:117] "RemoveContainer" containerID="25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7" Apr 16 15:17:15.615489 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:15.615436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c628w_must-gather-6ptvz_68bfc33e-ccd5-48f7-b0c6-31b75b0611af/gather/0.log" Apr 16 15:17:18.800271 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:18.800242 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-5927m_e378af81-ef05-47a5-b87b-2d78018a3700/global-pull-secret-syncer/0.log" Apr 16 15:17:18.945200 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:18.945161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2tqg5_0d793735-b1fb-4ca3-bc99-1447700e773f/konnectivity-agent/0.log" Apr 16 15:17:19.025873 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:19.025846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-195.ec2.internal_d244d209c4485d5d4fbaebc8851f6290/haproxy/0.log" Apr 16 15:17:19.679436 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:19.679410 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:17:19.679618 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:19.679541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:17:21.009829 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.009791 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c628w/must-gather-6ptvz"] Apr 16 15:17:21.010312 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.010095 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-c628w/must-gather-6ptvz" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="copy" containerID="cri-o://6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4" gracePeriod=2 Apr 16 15:17:21.011622 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.011592 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c628w/must-gather-6ptvz"] Apr 16 15:17:21.011847 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.011822 2576 
status_manager.go:895] "Failed to get status for pod" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" pod="openshift-must-gather-c628w/must-gather-6ptvz" err="pods \"must-gather-6ptvz\" is forbidden: User \"system:node:ip-10-0-141-195.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-c628w\": no relationship found between node 'ip-10-0-141-195.ec2.internal' and this object" Apr 16 15:17:21.236402 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.236381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c628w_must-gather-6ptvz_68bfc33e-ccd5-48f7-b0c6-31b75b0611af/copy/0.log" Apr 16 15:17:21.236721 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.236704 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:17:21.238613 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.238592 2576 status_manager.go:895] "Failed to get status for pod" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" pod="openshift-must-gather-c628w/must-gather-6ptvz" err="pods \"must-gather-6ptvz\" is forbidden: User \"system:node:ip-10-0-141-195.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-c628w\": no relationship found between node 'ip-10-0-141-195.ec2.internal' and this object" Apr 16 15:17:21.404677 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.404631 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output\") pod \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " Apr 16 15:17:21.404839 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.404715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26w2j\" (UniqueName: 
\"kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j\") pod \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\" (UID: \"68bfc33e-ccd5-48f7-b0c6-31b75b0611af\") " Apr 16 15:17:21.405961 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.405935 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "68bfc33e-ccd5-48f7-b0c6-31b75b0611af" (UID: "68bfc33e-ccd5-48f7-b0c6-31b75b0611af"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:17:21.407048 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.407025 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j" (OuterVolumeSpecName: "kube-api-access-26w2j") pod "68bfc33e-ccd5-48f7-b0c6-31b75b0611af" (UID: "68bfc33e-ccd5-48f7-b0c6-31b75b0611af"). InnerVolumeSpecName "kube-api-access-26w2j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:17:21.505991 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.505960 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-must-gather-output\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 15:17:21.505991 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.505989 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26w2j\" (UniqueName: \"kubernetes.io/projected/68bfc33e-ccd5-48f7-b0c6-31b75b0611af-kube-api-access-26w2j\") on node \"ip-10-0-141-195.ec2.internal\" DevicePath \"\"" Apr 16 15:17:21.709579 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:21.709501 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" path="/var/lib/kubelet/pods/68bfc33e-ccd5-48f7-b0c6-31b75b0611af/volumes" Apr 16 15:17:22.017434 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.017349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c628w_must-gather-6ptvz_68bfc33e-ccd5-48f7-b0c6-31b75b0611af/copy/0.log" Apr 16 15:17:22.017872 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.017683 2576 generic.go:358] "Generic (PLEG): container finished" podID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerID="6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4" exitCode=143 Apr 16 15:17:22.017872 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.017739 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c628w/must-gather-6ptvz" Apr 16 15:17:22.017872 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.017779 2576 scope.go:117] "RemoveContainer" containerID="6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4" Apr 16 15:17:22.025097 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.025081 2576 scope.go:117] "RemoveContainer" containerID="25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7" Apr 16 15:17:22.036613 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.036598 2576 scope.go:117] "RemoveContainer" containerID="6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4" Apr 16 15:17:22.036893 ip-10-0-141-195 kubenswrapper[2576]: E0416 15:17:22.036874 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4\": container with ID starting with 6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4 not found: ID does not exist" containerID="6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4" Apr 16 15:17:22.036943 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.036902 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4"} err="failed to get container status \"6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4\": rpc error: code = NotFound desc = could not find container \"6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4\": container with ID starting with 6b136d95920cef708316a489b08778578408fdea0e61fad2fa2dc340485707c4 not found: ID does not exist" Apr 16 15:17:22.036943 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.036920 2576 scope.go:117] "RemoveContainer" containerID="25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7" Apr 16 15:17:22.037134 
ip-10-0-141-195 kubenswrapper[2576]: E0416 15:17:22.037115 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7\": container with ID starting with 25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7 not found: ID does not exist" containerID="25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7" Apr 16 15:17:22.037178 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:22.037140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7"} err="failed to get container status \"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7\": rpc error: code = NotFound desc = could not find container \"25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7\": container with ID starting with 25f38de73db7e78c3e8147e37f3755dd7c5de1bf79cea33f27498702a018b6a7 not found: ID does not exist" Apr 16 15:17:23.003289 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.003267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-kdlzz_246df5c2-dafa-46a3-b572-3e25f6e310a8/monitoring-plugin/0.log" Apr 16 15:17:23.032368 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.032341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8d8v6_032e1c7b-011b-4be3-b4a6-5a7eb806fbff/node-exporter/0.log" Apr 16 15:17:23.054121 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.054098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8d8v6_032e1c7b-011b-4be3-b4a6-5a7eb806fbff/kube-rbac-proxy/0.log" Apr 16 15:17:23.079272 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.079253 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-8d8v6_032e1c7b-011b-4be3-b4a6-5a7eb806fbff/init-textfile/0.log" Apr 16 15:17:23.330937 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.330865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/prometheus/0.log" Apr 16 15:17:23.351873 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.351854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/config-reloader/0.log" Apr 16 15:17:23.374854 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.374832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/thanos-sidecar/0.log" Apr 16 15:17:23.395576 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.395557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/kube-rbac-proxy-web/0.log" Apr 16 15:17:23.419988 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.419966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/kube-rbac-proxy/0.log" Apr 16 15:17:23.440115 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.440092 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/kube-rbac-proxy-thanos/0.log" Apr 16 15:17:23.460204 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.460179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dfe686f1-f0d5-49a0-81a4-42aa4e1e6b59/init-config-reloader/0.log" Apr 16 15:17:23.660666 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.660620 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/thanos-query/0.log" Apr 16 15:17:23.693021 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.692993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/kube-rbac-proxy-web/0.log" Apr 16 15:17:23.722186 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.722158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/kube-rbac-proxy/0.log" Apr 16 15:17:23.754173 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.754145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/prom-label-proxy/0.log" Apr 16 15:17:23.785845 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.785819 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/kube-rbac-proxy-rules/0.log" Apr 16 15:17:23.816539 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:23.816503 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9fd75b6db-n4768_a555f4b5-439d-49f4-b8ed-e9bc2efe809d/kube-rbac-proxy-metrics/0.log" Apr 16 15:17:26.238876 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.238846 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd"] Apr 16 15:17:26.239211 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239122 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="gather" Apr 16 15:17:26.239211 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239133 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="gather" Apr 16 15:17:26.239211 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239155 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="copy" Apr 16 15:17:26.239211 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239160 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="copy" Apr 16 15:17:26.239211 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239212 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="copy" Apr 16 15:17:26.239370 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.239223 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68bfc33e-ccd5-48f7-b0c6-31b75b0611af" containerName="gather" Apr 16 15:17:26.243005 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.242985 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.245259 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.245237 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"kube-root-ca.crt\"" Apr 16 15:17:26.246022 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.246006 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"openshift-service-ca.crt\"" Apr 16 15:17:26.246095 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.246011 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtlq6\"/\"default-dockercfg-sktqr\"" Apr 16 15:17:26.251731 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.251710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd"] Apr 16 15:17:26.343506 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.343483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-sys\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.343653 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.343527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-podres\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.343653 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.343555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-lib-modules\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.343778 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.343664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jp6\" (UniqueName: \"kubernetes.io/projected/ddf873ee-df8b-41f5-92a7-260a2b7f433a-kube-api-access-c8jp6\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.343778 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.343708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-proc\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.444876 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-proc\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.444975 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-sys\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " 
pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.444975 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-podres\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.444975 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-lib-modules\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.444975 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-proc\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.445123 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jp6\" (UniqueName: \"kubernetes.io/projected/ddf873ee-df8b-41f5-92a7-260a2b7f433a-kube-api-access-c8jp6\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.445123 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.444985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-sys\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.445123 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.445053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-podres\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.445123 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.445084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddf873ee-df8b-41f5-92a7-260a2b7f433a-lib-modules\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.452698 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.452682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jp6\" (UniqueName: \"kubernetes.io/projected/ddf873ee-df8b-41f5-92a7-260a2b7f433a-kube-api-access-c8jp6\") pod \"perf-node-gather-daemonset-h5hdd\" (UID: \"ddf873ee-df8b-41f5-92a7-260a2b7f433a\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.553418 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.553367 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:26.669826 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.669795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd"] Apr 16 15:17:26.673470 ip-10-0-141-195 kubenswrapper[2576]: W0416 15:17:26.673443 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podddf873ee_df8b_41f5_92a7_260a2b7f433a.slice/crio-d6befe485f925493c62c4f10fa745453d8eaed3f8f29c4f6fba64410e3b59258 WatchSource:0}: Error finding container d6befe485f925493c62c4f10fa745453d8eaed3f8f29c4f6fba64410e3b59258: Status 404 returned error can't find the container with id d6befe485f925493c62c4f10fa745453d8eaed3f8f29c4f6fba64410e3b59258 Apr 16 15:17:26.674870 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.674855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jwtvm_f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4/dns/0.log" Apr 16 15:17:26.700627 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.700604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jwtvm_f5c2c4ed-d5b3-4fe6-be69-9b6eed9488d4/kube-rbac-proxy/0.log" Apr 16 15:17:26.770502 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:26.770483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8js26_15bab44e-b8bf-4170-b623-a5f844d8bfb0/dns-node-resolver/0.log" Apr 16 15:17:27.034468 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.034043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" event={"ID":"ddf873ee-df8b-41f5-92a7-260a2b7f433a","Type":"ContainerStarted","Data":"9ae862b1731ede4dd54e4c382b797b8222144f1cfc10f615d17e2d93b97dea94"} Apr 16 15:17:27.034468 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.034082 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" event={"ID":"ddf873ee-df8b-41f5-92a7-260a2b7f433a","Type":"ContainerStarted","Data":"d6befe485f925493c62c4f10fa745453d8eaed3f8f29c4f6fba64410e3b59258"} Apr 16 15:17:27.034769 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.034735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:27.050809 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.050772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" podStartSLOduration=1.05075914 podStartE2EDuration="1.05075914s" podCreationTimestamp="2026-04-16 15:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:27.049791716 +0000 UTC m=+1507.967666528" watchObservedRunningTime="2026-04-16 15:17:27.05075914 +0000 UTC m=+1507.968633954" Apr 16 15:17:27.209995 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.209970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6955d6c465-cssgp_d060e048-d462-47d1-a500-82e8b6eff8ba/registry/0.log" Apr 16 15:17:27.227307 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:27.227286 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7xqtf_653d6001-f9f0-440a-9aab-e87455bc4e3f/node-ca/0.log" Apr 16 15:17:28.284700 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.284609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bb2ms_f88b98b0-264f-41ee-a565-3a2941c70020/serve-healthcheck-canary/0.log" Apr 16 15:17:28.646954 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.646924 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xlbhf_2bc1e63a-763f-4bce-914e-ec0b77b7b58b/insights-operator/0.log" Apr 16 15:17:28.647147 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.647003 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xlbhf_2bc1e63a-763f-4bce-914e-ec0b77b7b58b/insights-operator/1.log" Apr 16 15:17:28.785913 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.785886 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d2qg6_d9e99e03-4d4d-406e-80e2-a86bf88da5d6/kube-rbac-proxy/0.log" Apr 16 15:17:28.804509 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.804489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d2qg6_d9e99e03-4d4d-406e-80e2-a86bf88da5d6/exporter/0.log" Apr 16 15:17:28.822587 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:28.822568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d2qg6_d9e99e03-4d4d-406e-80e2-a86bf88da5d6/extractor/0.log" Apr 16 15:17:30.805732 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:30.805702 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-xftc8_f904b88f-252d-430d-a4a8-25180af6783b/s3-init/0.log" Apr 16 15:17:34.049233 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:34.049200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-h5hdd" Apr 16 15:17:34.463651 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:34.463608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8t6wg_3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c/migrator/0.log" Apr 16 15:17:34.481892 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:34.481871 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8t6wg_3b4f3d1e-d7d2-45c2-a9ac-b5217f58501c/graceful-termination/0.log" Apr 16 15:17:35.831740 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:35.831708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-76rhf_c68702b9-21dc-43b1-ba9f-d5a236a6b183/kube-multus/0.log" Apr 16 15:17:36.228730 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.228705 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/kube-multus-additional-cni-plugins/0.log" Apr 16 15:17:36.247836 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.247811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/egress-router-binary-copy/0.log" Apr 16 15:17:36.266200 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.266177 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/cni-plugins/0.log" Apr 16 15:17:36.285851 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.285833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/bond-cni-plugin/0.log" Apr 16 15:17:36.303719 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.303701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/routeoverride-cni/0.log" Apr 16 15:17:36.322008 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.321986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/whereabouts-cni-bincopy/0.log" Apr 16 15:17:36.343102 
ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.343082 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gn47f_fac8cd8d-4a46-40df-b803-d33c16259cc1/whereabouts-cni/0.log" Apr 16 15:17:36.389887 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.389834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5zzl5_fb239c54-f254-4320-9008-4b4f5895660d/network-metrics-daemon/0.log" Apr 16 15:17:36.410078 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:36.410058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5zzl5_fb239c54-f254-4320-9008-4b4f5895660d/kube-rbac-proxy/0.log" Apr 16 15:17:37.272565 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.272540 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-controller/0.log" Apr 16 15:17:37.290757 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.290732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/0.log" Apr 16 15:17:37.297129 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.297096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovn-acl-logging/1.log" Apr 16 15:17:37.314843 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.314827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/kube-rbac-proxy-node/0.log" Apr 16 15:17:37.336088 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.336070 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:17:37.353957 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.353943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/northd/0.log" Apr 16 15:17:37.373488 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.373468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/nbdb/0.log" Apr 16 15:17:37.393730 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.393710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/sbdb/0.log" Apr 16 15:17:37.483888 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:37.483861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jsx45_7888efd0-340e-44a4-9e27-7fbbad8b7bfd/ovnkube-controller/0.log" Apr 16 15:17:38.905150 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:38.905124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-8d689_07abaf9c-1ea5-40a9-85a0-8d5cf01a1693/check-endpoints/0.log" Apr 16 15:17:38.971474 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:38.971450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dpr62_3e8487d9-5753-42d0-9838-80ce2360a1b0/network-check-target-container/0.log" Apr 16 15:17:39.898624 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:39.898600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9tmph_ffd2dff3-4034-47e4-b3bf-bd072dba227e/iptables-alerter/0.log" Apr 16 15:17:40.530127 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:40.530089 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gd596_6cdb3edd-1f53-40cc-a5d3-2aeb3c013c78/tuned/0.log" Apr 16 15:17:42.143490 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:42.143456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-npqr5_41c23f56-15f8-40a6-80cd-4dbe07fc9c5a/cluster-samples-operator/0.log" Apr 16 15:17:42.157752 ip-10-0-141-195 kubenswrapper[2576]: I0416 15:17:42.157732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-npqr5_41c23f56-15f8-40a6-80cd-4dbe07fc9c5a/cluster-samples-operator-watch/0.log"