Apr 16 20:55:31.967414 ip-10-0-138-120 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:55:31.967427 ip-10-0-138-120 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:55:31.967436 ip-10-0-138-120 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:55:31.967746 ip-10-0-138-120 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:55:42.201009 ip-10-0-138-120 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:55:42.201029 ip-10-0-138-120 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dc9ff5c0e87e41038ba41180f8dad856 --
Apr 16 20:58:05.692682 ip-10-0-138-120 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:58:06.084958 ip-10-0-138-120 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:06.084958 ip-10-0-138-120 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:58:06.084958 ip-10-0-138-120 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:06.084958 ip-10-0-138-120 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:58:06.084958 ip-10-0-138-120 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:06.086136 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.085677    2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091720    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091743    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091748    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091753    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091756    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:58:06.091753 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091760    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091764    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091768    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091772    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091776    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091780    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091784    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091789    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091794    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091798    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091802    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091805    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091809    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091815    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091821    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091825    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091829    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091832    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091837    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:58:06.092090 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091840    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091844    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091847    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091852    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091856    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091860    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091865    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091868    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091873    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091876    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091880    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091885    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091889    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091893    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091897    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091902    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091907    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091912    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091916    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091920    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:58:06.092864 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091924    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091928    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091933    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091937    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091941    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091945    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091949    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091953    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091957    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091961    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091965    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091969    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091973    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091978    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091981    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091985    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091989    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091993    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.091997    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092004    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:58:06.093537 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092008    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092011    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092015    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092020    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092024    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092027    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092031    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092038    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092042    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092047    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092051    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092055    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092059    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092063    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092068    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092073    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092077    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092081    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092085    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:58:06.094040 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092088    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092092    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092095    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092727    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092736    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092741    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092746    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092751    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092755    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092759    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092764    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092768    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092772    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092776    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092779    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092784    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092788    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092792    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092796    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:58:06.094575 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092801    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092806    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092810    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092815    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092819    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092822    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092826    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092831    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092836    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092840    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092844    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092848    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092852    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092859    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092865    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092871    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092876    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092880    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092884    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:58:06.095143 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092888    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092893    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092897    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092902    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092907    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092911    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092916    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092920    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092924    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092930    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092934    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092938    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092942    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092946    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092950    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092954    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092958    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092962    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092966    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092970    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:58:06.095983 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092974    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092978    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092982    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092987    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092991    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092995    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.092999    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093003    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093008    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093012    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093016    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093020    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093024    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093028    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093032    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093036    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093040    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093044    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093048    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:58:06.096660 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093052    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093056    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093060    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093064    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093067    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093072    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093076    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093080    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093084    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093087    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093091    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.093095    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094916    2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094934    2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094943    2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094950    2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094958    2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094964    2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094970    2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094977    2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094982    2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:58:06.097232 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094988    2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094993    2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.094999    2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095004    2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095009    2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095013    2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095018    2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095023    2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095028    2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095033    2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095040    2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095045    2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095050    2568 flags.go:64] FLAG: --config-dir=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095054    2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095060    2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095067    2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095072    2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095085    2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095091    2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095095    2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095100    2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095104    2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095109    2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095114    2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095121    2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:58:06.097993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095126    2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095130    2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095135    2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095140    2568 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095145    2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095152 2568 flags.go:64] FLAG: --event-burst="100" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095157 2568 flags.go:64] FLAG: --event-qps="50" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095162 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095167 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095172 2568 flags.go:64] FLAG: --eviction-hard="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095178 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095183 2568 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095187 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095192 2568 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095197 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095201 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095206 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095211 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095216 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095221 
2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095226 2568 flags.go:64] FLAG: --feature-gates="" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095232 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095237 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095242 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095247 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095252 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:58:06.098707 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095257 2568 flags.go:64] FLAG: --help="false" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095261 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095266 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095271 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095276 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095282 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095287 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:58:06.099450 ip-10-0-138-120 
kubenswrapper[2568]: I0416 20:58:06.095292 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095296 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095301 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095306 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095311 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095316 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095322 2568 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095326 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095331 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095336 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095340 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095345 2568 flags.go:64] FLAG: --lock-file="" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095350 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095354 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095359 2568 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095369 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:58:06.099450 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095391 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095396 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095401 2568 flags.go:64] FLAG: --logging-format="text" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095405 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095411 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095415 2568 flags.go:64] FLAG: --manifest-url="" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095420 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095427 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095432 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095443 2568 flags.go:64] FLAG: --max-pods="110" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095449 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095454 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095458 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:58:06.100014 ip-10-0-138-120 
kubenswrapper[2568]: I0416 20:58:06.095463 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095467 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095472 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095477 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095490 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095495 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095501 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095506 2568 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095511 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095519 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095524 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:58:06.100014 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095529 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095534 2568 flags.go:64] FLAG: --port="10250" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095539 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 
20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095543 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06a0f730f543d12ae" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095548 2568 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095553 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095557 2568 flags.go:64] FLAG: --register-node="true" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095562 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095567 2568 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095574 2568 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095578 2568 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095583 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095587 2568 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095593 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095598 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095603 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095608 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095612 2568 flags.go:64] FLAG: --runonce="false" Apr 16 
20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095617 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095622 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095627 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095631 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095636 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095641 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095646 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095650 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:58:06.100660 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095655 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095660 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095665 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095679 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095685 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095690 2568 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: 
I0416 20:58:06.095694 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095703 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095708 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095713 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095720 2568 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095725 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095729 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095737 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095741 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095747 2568 flags.go:64] FLAG: --v="2" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095760 2568 flags.go:64] FLAG: --version="false" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095767 2568 flags.go:64] FLAG: --vmodule="" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095773 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.095778 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095930 2568 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095937 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095943 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095947 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:06.101264 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095951 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095955 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095959 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095963 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095967 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095971 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095975 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095979 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095983 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095987 2568 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095990 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095994 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.095999 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096003 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096007 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096011 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096015 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096019 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096023 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096027 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:58:06.101848 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096031 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096042 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 
20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096046 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096050 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096054 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096058 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096062 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096066 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096071 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096075 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096079 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096083 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096087 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096091 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096095 2568 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096101 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096105 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096109 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096114 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096118 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:06.102355 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096122 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096126 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096130 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096134 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096138 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096143 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096148 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096153 
2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096156 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096160 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096164 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096168 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096172 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096178 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096182 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096186 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096192 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096198 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096203 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096208 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:06.102897 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096212 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096216 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096220 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096224 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096230 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096237 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096242 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096246 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096250 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096254 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096259 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096263 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096267 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096271 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096275 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096279 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096283 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096287 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096292 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:58:06.103440 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096296 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096300 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.096304 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.096978 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.103749 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.103767 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103814 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103819 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103822 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103825 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103829 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103832 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103836 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103838 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103841 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:58:06.103930 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103844 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103846 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103849 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103852 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103854 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103857 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103859 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103862 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103864 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103867 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103869 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103872 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103875 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103877 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103880 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103882 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103885 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103887 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103889 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103892 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:58:06.104304 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103894 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103897 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103901 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103906 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103909 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103913 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103915 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103918 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103920 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103923 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103926 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103928 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103931 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103933 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103936 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103939 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103941 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103944 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103946 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:58:06.104798 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103950 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103953 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103955 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103958 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103960 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103963 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103967 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103970 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103973 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103975 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103978 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103981 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103983 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103986 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103988 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103991 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103995 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.103997 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104000 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:58:06.105265 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104002 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104005 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104007 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104010 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104013 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104015 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104018 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104020 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104022 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104025 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104028 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104030 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104032 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104035 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104038 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104040 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104042 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104045 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:58:06.105757 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104048 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.104052 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104151 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104155 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104158 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104161 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104164 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104166 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104169 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104171 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104174 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104177 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104182 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104185 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104187 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:58:06.106189 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104190 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104193 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104195 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104197 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104200 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104202 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104205 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104207 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104210 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104212 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104215 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104217 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104220 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104222 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104225 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104227 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104230 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104232 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104234 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104236 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:58:06.106568 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104239 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104242 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104244 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104247 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104249 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104251 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104254 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104256 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104259 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104261 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104264 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104267 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104269 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104271 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104274 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104276 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104279 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104281 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104283 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104286 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:58:06.107051 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104288 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104291 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104293 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104295 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104298 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104300 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104303 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104307 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104310 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104313 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104316 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104319 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104321 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104324 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104326 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104329 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104331 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104334 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104336 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104338 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:58:06.107690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104341 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104344 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104346 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104349 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104351 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104354 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104356 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104359 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104361 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104364 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104366 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104368 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:06.104371 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.104394 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.105221 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:58:06.108184 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.107246 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:58:06.108572 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.108036 2568 server.go:1019] "Starting client certificate rotation"
Apr 16 20:58:06.108572 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.108136 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:58:06.108572 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.108179 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:58:06.130810 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.130783 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:58:06.133547 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.133530 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:58:06.145424 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.145394 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:58:06.151131 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.151112 2568 log.go:25] "Validated CRI v1 image API"
Apr 16 20:58:06.153576 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.153559 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:58:06.155718 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.155697 2568 fs.go:135] Filesystem UUIDs: map[63b88d91-dc87-4cb8-b86e-d0e7e84e1b93:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f145c62d-4b8c-41d3-9b1e-f361de11e9b5:/dev/nvme0n1p4]
Apr 16 20:58:06.155811 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.155717 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:58:06.161151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.161034 2568 manager.go:217] Machine: {Timestamp:2026-04-16 20:58:06.159333584 +0000 UTC m=+0.361870307 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097275 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b75733bff1018c706dbed4e668d08 SystemUUID:ec2b7573-3bff-1018-c706-dbed4e668d08 BootID:dc9ff5c0-e87e-4103-8ba4-1180f8dad856 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ab:4e:90:7b:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ab:4e:90:7b:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:cb:b9:c9:2d:09 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:58:06.161151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.161138 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:58:06.161279 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.161218 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:58:06.163109 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163087 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:58:06.163516 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163487 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:58:06.163662 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163519 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-120.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"Le
ssThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:58:06.163708 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163673 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:58:06.163708 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163681 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:58:06.163708 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.163698 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:58:06.164643 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.164632 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:58:06.165567 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.165557 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:58:06.165700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.165691 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:58:06.167835 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.167826 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:58:06.167874 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.167840 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:58:06.167874 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.167852 2568 
file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:58:06.167874 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.167864 2568 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:58:06.167874 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.167872 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:58:06.168838 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.168827 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:58:06.168892 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.168845 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:58:06.171753 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.171738 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:58:06.173081 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.173065 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:58:06.174718 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174704 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174741 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174752 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174761 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174770 2568 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174778 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174788 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:58:06.174794 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174796 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:58:06.175028 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174807 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:58:06.175028 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174816 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:58:06.175028 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174842 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:58:06.175028 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.174857 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:58:06.175656 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.175643 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:58:06.175717 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.175658 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:58:06.178691 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.178662 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:58:06.178784 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.178692 2568 csi_plugin.go:988] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-120.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:58:06.178784 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.178748 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-120.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:58:06.179360 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.179345 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:58:06.179444 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.179409 2568 server.go:1295] "Started kubelet" Apr 16 20:58:06.179508 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.179483 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:58:06.179571 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.179536 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:58:06.179613 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.179589 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:58:06.180361 ip-10-0-138-120 systemd[1]: Started Kubernetes Kubelet. Apr 16 20:58:06.182049 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.182035 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:58:06.183528 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.183513 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:58:06.187201 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.187182 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:58:06.188394 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.188356 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:58:06.189009 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.188991 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:58:06.189273 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.189247 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7zjwx" Apr 16 20:58:06.190552 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.190534 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:58:06.190620 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.190554 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:58:06.190620 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.190613 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:58:06.190802 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.190780 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.190887 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.190874 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:58:06.190939 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.190889 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:58:06.193402 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.193068 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:58:06.193402 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.193115 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-120.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:58:06.193526 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.193452 2568 factory.go:55] Registering systemd factory Apr 16 20:58:06.193526 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.193477 2568 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:58:06.194102 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194084 2568 factory.go:153] Registering CRI-O factory Apr 16 20:58:06.194102 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194105 2568 factory.go:223] Registration of the crio container factory successfully Apr 16 20:58:06.194217 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194163 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:58:06.194217 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194191 2568 factory.go:103] Registering Raw factory Apr 16 20:58:06.194217 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194208 2568 manager.go:1196] Started watching for new ooms in manager Apr 16 20:58:06.194890 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.194871 2568 manager.go:319] Starting recovery of all containers Apr 16 20:58:06.197959 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.193000 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-120.ec2.internal.18a6f1e90648f616 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-120.ec2.internal,UID:ip-10-0-138-120.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-120.ec2.internal,},FirstTimestamp:2026-04-16 20:58:06.17935823 +0000 UTC m=+0.381894968,LastTimestamp:2026-04-16 20:58:06.17935823 +0000 UTC m=+0.381894968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-120.ec2.internal,}" Apr 16 20:58:06.197959 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.197941 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7zjwx" Apr 16 20:58:06.206853 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.206698 2568 manager.go:324] Recovery completed Apr 16 20:58:06.211311 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.211298 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.214113 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.214097 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.214209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.214124 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.214209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.214138 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.214688 ip-10-0-138-120 kubenswrapper[2568]: I0416 
20:58:06.214675 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:58:06.214688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.214686 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:58:06.214783 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.214702 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:58:06.216636 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.216569 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-120.ec2.internal.18a6f1e9085b42a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-120.ec2.internal,UID:ip-10-0-138-120.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-120.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-120.ec2.internal,},FirstTimestamp:2026-04-16 20:58:06.214111904 +0000 UTC m=+0.416648626,LastTimestamp:2026-04-16 20:58:06.214111904 +0000 UTC m=+0.416648626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-120.ec2.internal,}" Apr 16 20:58:06.217975 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.217962 2568 policy_none.go:49] "None policy: Start" Apr 16 20:58:06.217975 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.217977 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:58:06.218077 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.217987 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:58:06.262011 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.261990 2568 manager.go:341] "Starting Device Plugin manager" Apr 16 
20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.262025 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262037 2568 server.go:85] "Starting device plugin registration server" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262281 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262295 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262371 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262464 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.262473 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.263131 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 20:58:06.282688 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.263163 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.327503 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.327455 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:58:06.328868 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.328842 2568 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 20:58:06.328963 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.328877 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:58:06.328963 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.328903 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 20:58:06.328963 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.328912 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:58:06.328963 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.328954 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:58:06.332768 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.332751 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:06.363069 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.362992 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.363970 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.363953 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.364058 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.363988 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.364058 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.364003 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.364058 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.364034 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.372552 ip-10-0-138-120 kubenswrapper[2568]: I0416 
20:58:06.372536 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.372630 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.372559 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-120.ec2.internal\": node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.388437 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.388419 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.429706 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.429669 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"] Apr 16 20:58:06.429871 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.429765 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.430727 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.430711 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.430803 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.430740 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.430803 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.430750 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.433134 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.433120 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.433312 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.433296 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.433393 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.433341 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.434409 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434395 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.434478 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434424 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.434478 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434395 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.434478 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434439 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.434478 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434454 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.434478 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.434478 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.436725 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.436712 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.436767 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.436737 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:06.437451 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.437434 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:06.437513 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.437465 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:06.437513 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.437477 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:06.456948 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.456920 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-120.ec2.internal\" not found" node="ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.460801 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.460783 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-120.ec2.internal\" not found" node="ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.489344 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.489320 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.493712 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.493693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.493774 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.493720 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.493774 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.493738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.589728 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.589697 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 16 20:58:06.594089 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 16 20:58:06.594144 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.594144 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.594208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.594208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.594208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.594177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.690640 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.690569 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found"
Apr 16 20:58:06.759039 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.759009 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.763566 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.763547 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:06.791448 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.791414 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found"
Apr 16 20:58:06.891988 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:06.891945 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found"
Apr 16 20:58:06.986985 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.986918 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:58:06.991257 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:06.991241 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:07.002151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.002130 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:58:07.003790 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.003776 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"
Apr 16 20:58:07.012998 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.012974 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:58:07.108525 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.108491 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:58:07.109116 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.108646 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:58:07.109116 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.108695 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:58:07.168860 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.168832 2568 apiserver.go:52] "Watching apiserver"
Apr 16 20:58:07.189099 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.189067 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:58:07.189307 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.189069 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:58:07.190825 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.190786 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-t6bp2","openshift-dns/node-resolver-cjqkn","openshift-image-registry/node-ca-s6rx6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal","openshift-multus/network-metrics-daemon-llm2q","openshift-network-operator/iptables-alerter-nrqjf","kube-system/konnectivity-agent-6dn6k","kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs","openshift-multus/multus-additional-cni-plugins-bt6zf","openshift-multus/multus-pd7bg","openshift-network-diagnostics/network-check-target-z6tjd","openshift-ovn-kubernetes/ovnkube-node-bvjzw"]
Apr 16 20:58:07.195365 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.195346 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6dn6k"
Apr 16 20:58:07.197560 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.197536 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d60d121-9e76-46da-a382-d5c74b2c3a1e-konnectivity-ca\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k"
Apr 16 20:58:07.197775 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.197576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d60d121-9e76-46da-a382-d5c74b2c3a1e-agent-certs\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k"
Apr 16 20:58:07.197775 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.197545 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cjqkn"
Apr 16 20:58:07.197775 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.197637 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s6rx6"
Apr 16 20:58:07.197973 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.197920 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 20:58:07.198142 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.198123 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cd57d\""
Apr 16 20:58:07.198226 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.198133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 20:58:07.199628 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.199611 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:58:07.199913 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.199854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:07.199913 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.199908 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.200075 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.199923 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:07.200075 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.199860 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.200075 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.200032 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5c29\""
Apr 16 20:58:07.200226 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.200078 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 20:58:07.200226 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.200210 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:53:06 +0000 UTC" deadline="2027-09-17 09:55:18.513116085 +0000 UTC"
Apr 16 20:58:07.200323 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.200234 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12444h57m11.312885135s"
Apr 16 20:58:07.200323 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.200248 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.201305 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.201286 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.201428 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.201316 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:58:07.201428 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.201323 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-48sqq\""
Apr 16 20:58:07.201987 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.201968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.204127 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.204112 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.204742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.204725 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.204828 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.204792 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.204828 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.204802 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:58:07.204936 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.204869 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s6qxs\""
Apr 16 20:58:07.206624 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.206604 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.206873 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.206854 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.206964 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.206932 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xrk5p\""
Apr 16 20:58:07.208295 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.207263 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.209739 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.209719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.210925 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.210838 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-947t8\""
Apr 16 20:58:07.211001 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.210962 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.211069 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.211048 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.211212 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.211196 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:58:07.212162 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.212143 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.213139 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.213124 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.213245 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.213229 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.214460 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.214442 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:07.214534 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.214505 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:07.215637 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.215619 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:58:07.215975 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.215958 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:58:07.216048 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.215978 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:58:07.216048 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.215965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-446tz\""
Apr 16 20:58:07.216157 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.216100 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:58:07.216308 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.216293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lsts\""
Apr 16 20:58:07.216677 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.216658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.219462 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.219445 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:58:07.220668 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220602 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:58:07.220668 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qbzjk\""
Apr 16 20:58:07.220668 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220634 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:58:07.220668 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:58:07.220914 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220638 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:58:07.220914 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.220604 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:58:07.238750 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.238693 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gn9vh"
Apr 16 20:58:07.246629 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.246610 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gn9vh"
Apr 16 20:58:07.291568 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.291547 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:58:07.297835 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d60d121-9e76-46da-a382-d5c74b2c3a1e-konnectivity-ca\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k"
Apr 16 20:58:07.297980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysconfig\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.297980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-var-lib-kubelet\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.297980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.297980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-os-release\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.297980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.297981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-cni-binary-copy\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95sh\" (UniqueName: \"kubernetes.io/projected/790473ff-de78-426c-8164-f182aadaf583-kube-api-access-c95sh\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-modprobe-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298058 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-var-lib-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-bin\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.298209 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a96ee09-38fe-48c6-891c-03c6540c788b-host\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6"
Apr 16 20:58:07.298583 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-lib-modules\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.298583 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-kubelet\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.298583 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-systemd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.298583 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-env-overrides\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.298761 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d60d121-9e76-46da-a382-d5c74b2c3a1e-konnectivity-ca\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k"
Apr 16 20:58:07.298889 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-kubernetes\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.298960 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298909 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-socket-dir-parent\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299010 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.298963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-bin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299010 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299001 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-hostroot\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299107 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-multus-daemon-config\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299107 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-systemd-units\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299241 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-conf\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.299297 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-sys\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.299297 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk26c\" (UniqueName: \"kubernetes.io/projected/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-kube-api-access-fk26c\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.299440 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-etc-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299440 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-script-lib\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299538 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-k8s-cni-cncf-io\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299538 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-multus\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299538 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brt6z\" (UniqueName: \"kubernetes.io/projected/5382259d-9300-4c27-9870-5d4b5910104d-kube-api-access-brt6z\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.299669 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299669 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299604 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-ovn\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299669 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.299669 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-netns\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299838 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-etc-kubernetes\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.299838 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gj6\" (UniqueName: \"kubernetes.io/projected/66b94037-d7e2-4eef-911d-5525fbe6343a-kube-api-access-c7gj6\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299838 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-cnibin\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.299960 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-slash\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.299960 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:07.299960 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrrt\" (UniqueName: \"kubernetes.io/projected/a6620072-c60f-4d78-bc86-aac34b2c5098-kube-api-access-bcrrt\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:07.300087 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16da182a-ff12-4e2d-800d-10e00ef1512d-hosts-file\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn"
Apr 16 20:58:07.300087 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.299991 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.300087 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b94037-d7e2-4eef-911d-5525fbe6343a-ovn-node-metrics-cert\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.300087 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.300253 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.300253 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300119 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-device-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.300253 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.300253 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300179 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5382259d-9300-4c27-9870-5d4b5910104d-host-slash\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.300253
ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-kubelet\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.300253 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300244 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-node-log\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.300531 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a96ee09-38fe-48c6-891c-03c6540c788b-serviceca\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.300531 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300310 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-systemd\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.300531 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300460 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-tuned\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.300531 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.300700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:07.300700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-config\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.300700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d60d121-9e76-46da-a382-d5c74b2c3a1e-agent-certs\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:07.300700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-cnibin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.300700 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300672 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9276\" (UniqueName: \"kubernetes.io/projected/16da182a-ff12-4e2d-800d-10e00ef1512d-kube-api-access-k9276\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.300961 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5382259d-9300-4c27-9870-5d4b5910104d-iptables-alerter-script\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf" Apr 16 20:58:07.300961 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300769 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-netns\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.300961 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300804 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.300961 ip-10-0-138-120 kubenswrapper[2568]: I0416 
20:58:07.300936 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-run\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300969 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-tmp\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.300996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dkc\" (UniqueName: \"kubernetes.io/projected/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kube-api-access-c9dkc\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301029 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-system-cni-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fktf6\" (UniqueName: \"kubernetes.io/projected/53b01fc4-bf7b-492d-ac6d-538fc5854832-kube-api-access-fktf6\") pod \"multus-additional-cni-plugins-bt6zf\" 
(UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-system-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301106 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:58:07.301130 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgb7t\" (UniqueName: \"kubernetes.io/projected/0a96ee09-38fe-48c6-891c-03c6540c788b-kube-api-access-tgb7t\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.301432 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-os-release\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.301432 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-multus-certs\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " 
pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.301432 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-netd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.301432 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.301611 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-host\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.301611 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.301611 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301510 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-conf-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.301611 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16da182a-ff12-4e2d-800d-10e00ef1512d-tmp-dir\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.301788 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.301626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-log-socket\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.304819 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.304798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d60d121-9e76-46da-a382-d5c74b2c3a1e-agent-certs\") pod \"konnectivity-agent-6dn6k\" (UID: \"9d60d121-9e76-46da-a382-d5c74b2c3a1e\") " pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:07.322692 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.322659 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5556f6af36052a906fa0aef20bfb6c.slice/crio-1317cda4466de44fc703763f3383951cf96cace62c753e7e308891e8c31d1aa5 WatchSource:0}: Error finding container 1317cda4466de44fc703763f3383951cf96cace62c753e7e308891e8c31d1aa5: Status 404 returned error can't find the container with id 1317cda4466de44fc703763f3383951cf96cace62c753e7e308891e8c31d1aa5 Apr 16 20:58:07.322942 ip-10-0-138-120 
kubenswrapper[2568]: W0416 20:58:07.322920 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5550cc00f884a90499316ee8207508.slice/crio-e84ec9434a13852c4d6e0757030619b1707652704dbb1e40bd4eb19d1c2b06b1 WatchSource:0}: Error finding container e84ec9434a13852c4d6e0757030619b1707652704dbb1e40bd4eb19d1c2b06b1: Status 404 returned error can't find the container with id e84ec9434a13852c4d6e0757030619b1707652704dbb1e40bd4eb19d1c2b06b1 Apr 16 20:58:07.327694 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.327677 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:58:07.332535 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.332495 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerStarted","Data":"e84ec9434a13852c4d6e0757030619b1707652704dbb1e40bd4eb19d1c2b06b1"} Apr 16 20:58:07.333453 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.333435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" event={"ID":"4c5556f6af36052a906fa0aef20bfb6c","Type":"ContainerStarted","Data":"1317cda4466de44fc703763f3383951cf96cace62c753e7e308891e8c31d1aa5"} Apr 16 20:58:07.403201 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-var-lib-kubelet\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.403201 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-os-release\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-cni-binary-copy\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403298 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403319 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-os-release\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-var-lib-kubelet\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403362 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c95sh\" (UniqueName: \"kubernetes.io/projected/790473ff-de78-426c-8164-f182aadaf583-kube-api-access-c95sh\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-modprobe-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.403473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403473 ip-10-0-138-120 
kubenswrapper[2568]: I0416 20:58:07.403467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-var-lib-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-modprobe-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-bin\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-var-lib-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-bin\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a96ee09-38fe-48c6-891c-03c6540c788b-host\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403666 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-lib-modules\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-kubelet\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " 
pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403701 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a96ee09-38fe-48c6-891c-03c6540c788b-host\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403710 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-systemd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-systemd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-kubelet\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-env-overrides\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.403946 
ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403828 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-kubernetes\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403847 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-socket-dir-parent\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.403946 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-cni-binary-copy\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-socket-dir-parent\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-bin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-hostroot\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-bin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403909 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-kubernetes\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-multus-daemon-config\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403987 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-hostroot\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.403998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-systemd-units\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-conf\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-lib-modules\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404057 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-systemd-units\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404056 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-sys\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk26c\" (UniqueName: \"kubernetes.io/projected/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-kube-api-access-fk26c\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-sys\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-etc-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-script-lib\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.404670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-k8s-cni-cncf-io\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-etc-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-multus\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404203 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-conf\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-var-lib-cni-multus\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brt6z\" (UniqueName: \"kubernetes.io/projected/5382259d-9300-4c27-9870-5d4b5910104d-kube-api-access-brt6z\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-k8s-cni-cncf-io\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404225 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-env-overrides\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404337 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-ovn\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-openvswitch\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-netns\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-run-ovn\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-etc-kubernetes\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-netns\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gj6\" (UniqueName: \"kubernetes.io/projected/66b94037-d7e2-4eef-911d-5525fbe6343a-kube-api-access-c7gj6\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-etc-kubernetes\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.405373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-cnibin\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/790473ff-de78-426c-8164-f182aadaf583-multus-daemon-config\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-cnibin\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-slash\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrrt\" (UniqueName: \"kubernetes.io/projected/a6620072-c60f-4d78-bc86-aac34b2c5098-kube-api-access-bcrrt\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16da182a-ff12-4e2d-800d-10e00ef1512d-hosts-file\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404655 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-slash\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-script-lib\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16da182a-ff12-4e2d-800d-10e00ef1512d-hosts-file\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.404733 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b94037-d7e2-4eef-911d-5525fbe6343a-ovn-node-metrics-cert\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.404792 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:07.904776673 +0000 UTC m=+2.107313383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.406355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404838 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-device-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5382259d-9300-4c27-9870-5d4b5910104d-host-slash\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404936 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysctl-d\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-kubelet\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404962 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-device-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-node-log\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5382259d-9300-4c27-9870-5d4b5910104d-host-slash\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.404998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a96ee09-38fe-48c6-891c-03c6540c788b-serviceca\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-node-log\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-kubelet\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-systemd\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-tuned\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-systemd\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-config\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405181 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-cnibin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9276\" (UniqueName: \"kubernetes.io/projected/16da182a-ff12-4e2d-800d-10e00ef1512d-kube-api-access-k9276\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5382259d-9300-4c27-9870-5d4b5910104d-iptables-alerter-script\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-cnibin\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-netns\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-netns\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-run\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-tmp\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405395 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9dkc\" (UniqueName: \"kubernetes.io/projected/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kube-api-access-c9dkc\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405403 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a96ee09-38fe-48c6-891c-03c6540c788b-serviceca\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-system-cni-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fktf6\" (UniqueName: \"kubernetes.io/projected/53b01fc4-bf7b-492d-ac6d-538fc5854832-kube-api-access-fktf6\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.407765 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-run\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-system-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgb7t\" (UniqueName: \"kubernetes.io/projected/0a96ee09-38fe-48c6-891c-03c6540c788b-kube-api-access-tgb7t\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-os-release\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405539 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-multus-certs\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-os-release\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53b01fc4-bf7b-492d-ac6d-538fc5854832-system-cni-dir\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-netd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405601 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-system-cni-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405639 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b94037-d7e2-4eef-911d-5525fbe6343a-ovnkube-config\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405644 2568 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-host\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-conf-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16da182a-ff12-4e2d-800d-10e00ef1512d-tmp-dir\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405743 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-log-socket\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405769 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysconfig\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408248 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405810 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-host\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-sysconfig\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405879 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-multus-conf-dir\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405916 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-host-cni-netd\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.405967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/790473ff-de78-426c-8164-f182aadaf583-host-run-multus-certs\") pod \"multus-pd7bg\" (UID: \"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.406031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b94037-d7e2-4eef-911d-5525fbe6343a-log-socket\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.406315 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16da182a-ff12-4e2d-800d-10e00ef1512d-tmp-dir\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.406334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5382259d-9300-4c27-9870-5d4b5910104d-iptables-alerter-script\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.406557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53b01fc4-bf7b-492d-ac6d-538fc5854832-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.407499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-etc-tuned\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.407597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-tmp\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.408742 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.407653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b94037-d7e2-4eef-911d-5525fbe6343a-ovn-node-metrics-cert\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.411445 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.411428 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:07.411483 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.411450 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:07.411483 ip-10-0-138-120 
kubenswrapper[2568]: E0416 20:58:07.411460 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:07.411569 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.411511 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:07.911498947 +0000 UTC m=+2.114035656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:07.414247 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.414220 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gj6\" (UniqueName: \"kubernetes.io/projected/66b94037-d7e2-4eef-911d-5525fbe6343a-kube-api-access-c7gj6\") pod \"ovnkube-node-bvjzw\" (UID: \"66b94037-d7e2-4eef-911d-5525fbe6343a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.414947 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.414921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrrt\" (UniqueName: \"kubernetes.io/projected/a6620072-c60f-4d78-bc86-aac34b2c5098-kube-api-access-bcrrt\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 
20:58:07.415088 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.415060 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fktf6\" (UniqueName: \"kubernetes.io/projected/53b01fc4-bf7b-492d-ac6d-538fc5854832-kube-api-access-fktf6\") pod \"multus-additional-cni-plugins-bt6zf\" (UID: \"53b01fc4-bf7b-492d-ac6d-538fc5854832\") " pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.416011 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.415984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk26c\" (UniqueName: \"kubernetes.io/projected/6d4d5e69-9954-4fed-9353-068ccd3ae9ef-kube-api-access-fk26c\") pod \"tuned-t6bp2\" (UID: \"6d4d5e69-9954-4fed-9353-068ccd3ae9ef\") " pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.416205 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.416184 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9276\" (UniqueName: \"kubernetes.io/projected/16da182a-ff12-4e2d-800d-10e00ef1512d-kube-api-access-k9276\") pod \"node-resolver-cjqkn\" (UID: \"16da182a-ff12-4e2d-800d-10e00ef1512d\") " pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.416292 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.416272 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brt6z\" (UniqueName: \"kubernetes.io/projected/5382259d-9300-4c27-9870-5d4b5910104d-kube-api-access-brt6z\") pod \"iptables-alerter-nrqjf\" (UID: \"5382259d-9300-4c27-9870-5d4b5910104d\") " pod="openshift-network-operator/iptables-alerter-nrqjf" Apr 16 20:58:07.416479 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.416462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95sh\" (UniqueName: \"kubernetes.io/projected/790473ff-de78-426c-8164-f182aadaf583-kube-api-access-c95sh\") pod \"multus-pd7bg\" (UID: 
\"790473ff-de78-426c-8164-f182aadaf583\") " pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.416981 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.416964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9dkc\" (UniqueName: \"kubernetes.io/projected/bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e-kube-api-access-c9dkc\") pod \"aws-ebs-csi-driver-node-vvnxs\" (UID: \"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.417058 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.417041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgb7t\" (UniqueName: \"kubernetes.io/projected/0a96ee09-38fe-48c6-891c-03c6540c788b-kube-api-access-tgb7t\") pod \"node-ca-s6rx6\" (UID: \"0a96ee09-38fe-48c6-891c-03c6540c788b\") " pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.523519 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.523414 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:07.525647 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.525625 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:07.532256 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.532230 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d60d121_9e76_46da_a382_d5c74b2c3a1e.slice/crio-b24ab41a9a941f473973e6e9b8b666fd943724697a817d840e48dda2a1348f73 WatchSource:0}: Error finding container b24ab41a9a941f473973e6e9b8b666fd943724697a817d840e48dda2a1348f73: Status 404 returned error can't find the container with id b24ab41a9a941f473973e6e9b8b666fd943724697a817d840e48dda2a1348f73 Apr 16 20:58:07.537878 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.537861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cjqkn" Apr 16 20:58:07.544280 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.544258 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16da182a_ff12_4e2d_800d_10e00ef1512d.slice/crio-f9c1ce49e2df26718bedcc932dba6c20ef424263e8889556363014f63a9940a0 WatchSource:0}: Error finding container f9c1ce49e2df26718bedcc932dba6c20ef424263e8889556363014f63a9940a0: Status 404 returned error can't find the container with id f9c1ce49e2df26718bedcc932dba6c20ef424263e8889556363014f63a9940a0 Apr 16 20:58:07.551294 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.551273 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-s6rx6" Apr 16 20:58:07.557011 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.556988 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a96ee09_38fe_48c6_891c_03c6540c788b.slice/crio-e0f6a24f9171ad296c04d14215ee6e55e9cf436e2ffd5ea73e7040efe28ae0e1 WatchSource:0}: Error finding container e0f6a24f9171ad296c04d14215ee6e55e9cf436e2ffd5ea73e7040efe28ae0e1: Status 404 returned error can't find the container with id e0f6a24f9171ad296c04d14215ee6e55e9cf436e2ffd5ea73e7040efe28ae0e1 Apr 16 20:58:07.565462 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.565440 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nrqjf" Apr 16 20:58:07.570991 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.570972 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" Apr 16 20:58:07.572690 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.572671 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5382259d_9300_4c27_9870_5d4b5910104d.slice/crio-ac99f6836688e24bf91226954726986a4a0ed61c92bd619cf3462274f98f577d WatchSource:0}: Error finding container ac99f6836688e24bf91226954726986a4a0ed61c92bd619cf3462274f98f577d: Status 404 returned error can't find the container with id ac99f6836688e24bf91226954726986a4a0ed61c92bd619cf3462274f98f577d Apr 16 20:58:07.577957 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.577925 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4d5e69_9954_4fed_9353_068ccd3ae9ef.slice/crio-90b0040c56a6b600530e2abde29e93896d31902eaa6e278d0612353d63889e99 WatchSource:0}: Error finding container 
90b0040c56a6b600530e2abde29e93896d31902eaa6e278d0612353d63889e99: Status 404 returned error can't find the container with id 90b0040c56a6b600530e2abde29e93896d31902eaa6e278d0612353d63889e99 Apr 16 20:58:07.584469 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.584450 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" Apr 16 20:58:07.591315 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.591294 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc34332f_50e0_4c8b_b9e8_b3ef23f49a6e.slice/crio-84afb54aacd83f1132369694a9e14804599a506bd0b52f879b026dbc83243ac0 WatchSource:0}: Error finding container 84afb54aacd83f1132369694a9e14804599a506bd0b52f879b026dbc83243ac0: Status 404 returned error can't find the container with id 84afb54aacd83f1132369694a9e14804599a506bd0b52f879b026dbc83243ac0 Apr 16 20:58:07.603697 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.603664 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" Apr 16 20:58:07.609512 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.609485 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b01fc4_bf7b_492d_ac6d_538fc5854832.slice/crio-3e5967667a9e6e33454f437ef6b02b12a51d1784343792dc4809ba103b84e231 WatchSource:0}: Error finding container 3e5967667a9e6e33454f437ef6b02b12a51d1784343792dc4809ba103b84e231: Status 404 returned error can't find the container with id 3e5967667a9e6e33454f437ef6b02b12a51d1784343792dc4809ba103b84e231 Apr 16 20:58:07.628535 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.628515 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pd7bg" Apr 16 20:58:07.634321 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.634299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:07.634679 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.634654 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790473ff_de78_426c_8164_f182aadaf583.slice/crio-29aea66a735a7256ff4d933b5a572331a5ac6102c6b5c38d62ab0e5d73cc3968 WatchSource:0}: Error finding container 29aea66a735a7256ff4d933b5a572331a5ac6102c6b5c38d62ab0e5d73cc3968: Status 404 returned error can't find the container with id 29aea66a735a7256ff4d933b5a572331a5ac6102c6b5c38d62ab0e5d73cc3968 Apr 16 20:58:07.640574 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:07.640553 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66b94037_d7e2_4eef_911d_5525fbe6343a.slice/crio-137d292d4f39100d9978ee72e1cf82050cec1f307b63b5646a156e0992dbb02d WatchSource:0}: Error finding container 137d292d4f39100d9978ee72e1cf82050cec1f307b63b5646a156e0992dbb02d: Status 404 returned error can't find the container with id 137d292d4f39100d9978ee72e1cf82050cec1f307b63b5646a156e0992dbb02d Apr 16 20:58:07.909056 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.908972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:07.909203 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.909111 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:07.909203 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:07.909171 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:08.909154367 +0000 UTC m=+3.111691082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:07.978653 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:07.978613 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:08.009661 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.009625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:08.009821 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.009772 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:08.009821 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.009793 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:08.009821 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.009807 2568 
projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:08.010003 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.009869 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:09.009849743 +0000 UTC m=+3.212386460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:08.247551 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.247447 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:53:07 +0000 UTC" deadline="2027-12-26 13:27:09.389630846 +0000 UTC" Apr 16 20:58:08.247551 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.247491 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14848h29m1.142143941s" Apr 16 20:58:08.335201 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.335169 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:08.335394 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.335303 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:08.350516 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.350475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerStarted","Data":"3e5967667a9e6e33454f437ef6b02b12a51d1784343792dc4809ba103b84e231"} Apr 16 20:58:08.369844 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.369790 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" event={"ID":"6d4d5e69-9954-4fed-9353-068ccd3ae9ef","Type":"ContainerStarted","Data":"90b0040c56a6b600530e2abde29e93896d31902eaa6e278d0612353d63889e99"} Apr 16 20:58:08.383092 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.383049 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nrqjf" event={"ID":"5382259d-9300-4c27-9870-5d4b5910104d","Type":"ContainerStarted","Data":"ac99f6836688e24bf91226954726986a4a0ed61c92bd619cf3462274f98f577d"} Apr 16 20:58:08.397690 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.397650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6rx6" event={"ID":"0a96ee09-38fe-48c6-891c-03c6540c788b","Type":"ContainerStarted","Data":"e0f6a24f9171ad296c04d14215ee6e55e9cf436e2ffd5ea73e7040efe28ae0e1"} Apr 16 20:58:08.401347 ip-10-0-138-120 
kubenswrapper[2568]: I0416 20:58:08.401317 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cjqkn" event={"ID":"16da182a-ff12-4e2d-800d-10e00ef1512d","Type":"ContainerStarted","Data":"f9c1ce49e2df26718bedcc932dba6c20ef424263e8889556363014f63a9940a0"}
Apr 16 20:58:08.402908 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.402883 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pd7bg" event={"ID":"790473ff-de78-426c-8164-f182aadaf583","Type":"ContainerStarted","Data":"29aea66a735a7256ff4d933b5a572331a5ac6102c6b5c38d62ab0e5d73cc3968"}
Apr 16 20:58:08.418519 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.418483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" event={"ID":"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e","Type":"ContainerStarted","Data":"84afb54aacd83f1132369694a9e14804599a506bd0b52f879b026dbc83243ac0"}
Apr 16 20:58:08.428078 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.428044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6dn6k" event={"ID":"9d60d121-9e76-46da-a382-d5c74b2c3a1e","Type":"ContainerStarted","Data":"b24ab41a9a941f473973e6e9b8b666fd943724697a817d840e48dda2a1348f73"}
Apr 16 20:58:08.437392 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.437320 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"137d292d4f39100d9978ee72e1cf82050cec1f307b63b5646a156e0992dbb02d"}
Apr 16 20:58:08.916664 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:08.916626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:08.916861 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.916757 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:08.916861 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:08.916813 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:10.916795595 +0000 UTC m=+5.119332309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:09.020417 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:09.017131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:09.020417 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:09.017279 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:58:09.020417 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:09.017297 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:58:09.020417 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:09.017310 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:09.020417 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:09.017365 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:11.017345869 +0000 UTC m=+5.219882586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:09.248665 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:09.248577 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:53:07 +0000 UTC" deadline="2027-12-15 05:44:35.421079247 +0000 UTC"
Apr 16 20:58:09.248665 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:09.248616 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14576h46m26.172467842s"
Apr 16 20:58:09.330067 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:09.329936 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:09.330266 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:09.330077 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:10.246244 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:10.246204 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:58:10.330528 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:10.330498 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:10.330977 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:10.330629 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:10.932758 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:10.932721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:10.932939 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:10.932852 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:10.932939 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:10.932908 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:14.932891417 +0000 UTC m=+9.135428140 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:11.033697 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:11.033657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:11.033880 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:11.033842 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:58:11.033880 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:11.033857 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:58:11.033880 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:11.033867 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:11.034024 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:11.033921 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:15.033901507 +0000 UTC m=+9.236438280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:11.329980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:11.329879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:11.330142 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:11.330011 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:12.332414 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:12.332363 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:12.332864 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:12.332500 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:13.330313 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:13.329784 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:13.330313 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:13.329961 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:14.329529 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:14.329494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:14.329999 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:14.329609 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:14.969344 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:14.969308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:14.969553 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:14.969507 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:14.969611 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:14.969574 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:22.969555283 +0000 UTC m=+17.172091994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:15.070113 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:15.070074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:15.070295 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:15.070270 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:58:15.070353 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:15.070297 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:58:15.070353 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:15.070311 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:15.070476 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:15.070406 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:23.07036076 +0000 UTC m=+17.272897473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:15.329957 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:15.329870 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:15.330431 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:15.330016 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:16.333131 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:16.333095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:16.333588 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:16.333216 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:17.330184 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.329927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:17.330331 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:17.330298 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:17.856083 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.856048 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h8fjd"]
Apr 16 20:58:17.860214 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.860190 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.860344 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:17.860278 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd"
Apr 16 20:58:17.892146 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.892067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-dbus\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.892146 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.892115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.892355 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.892151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-kubelet-config\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992592 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.992554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-dbus\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992769 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.992609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992769 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.992644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-kubelet-config\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992861 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.992781 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-kubelet-config\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992861 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:17.992781 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b1b29e8-6823-494b-9501-ec38717ca6cd-dbus\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:17.992861 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:17.992798 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:17.992861 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:17.992857 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:18.492841992 +0000 UTC m=+12.695378706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:18.330131 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:18.330033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:18.330288 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:18.330161 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:18.497122 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:18.497081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:18.497305 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:18.497224 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:18.497305 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:18.497290 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:19.497272836 +0000 UTC m=+13.699809551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:19.329632 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:19.329599 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:19.330060 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:19.329606 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:19.330060 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:19.329727 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd"
Apr 16 20:58:19.330060 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:19.329831 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:19.505779 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:19.505740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:19.505974 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:19.505913 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:19.506032 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:19.505991 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:21.505968579 +0000 UTC m=+15.708505293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:20.332956 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:20.332927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:20.333290 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:20.333013 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:21.329550 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:21.329511 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:21.329728 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:21.329531 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:21.329728 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:21.329630 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd"
Apr 16 20:58:21.329827 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:21.329764 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 20:58:21.519318 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:21.519280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:21.519829 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:21.519423 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:21.519829 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:21.519477 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:25.519461818 +0000 UTC m=+19.721998528 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:58:22.330060 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:22.330024 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:22.330226 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:22.330130 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9"
Apr 16 20:58:23.029296 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:23.029260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:23.029814 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.029432 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:23.029814 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.029501 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:39.029483157 +0000 UTC m=+33.232019870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:58:23.130600 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:23.130560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:58:23.130775 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.130727 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:58:23.130775 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.130750 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:58:23.130775 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.130764 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:23.130909 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.130827 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:39.130806521 +0000 UTC m=+33.333343234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:58:23.329512 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:23.329425 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:23.329659 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:23.329427 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:58:23.329659 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.329558 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd"
Apr 16 20:58:23.329659 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:23.329631 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:24.329492 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:24.329450 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:24.329936 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:24.329570 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:25.329696 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:25.329651 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:25.330117 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:25.329795 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:25.330194 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:25.329651 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:25.330242 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:25.330217 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:25.547262 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:25.547213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:25.547431 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:25.547410 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:25.547502 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:25.547490 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:33.547470371 +0000 UTC m=+27.750007087 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:26.330208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.330054 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:26.330665 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:26.330235 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:26.476063 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.475859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" event={"ID":"4c5556f6af36052a906fa0aef20bfb6c","Type":"ContainerStarted","Data":"930f35397d1d4ad6098bfd9d68ba9a364fc716ca2d64b55a2d6c520461f8f465"} Apr 16 20:58:26.478340 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.478303 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 20:58:26.479008 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.478983 2568 generic.go:358] "Generic (PLEG): container finished" podID="66b94037-d7e2-4eef-911d-5525fbe6343a" containerID="46476fc5181f68c7b4fae0903219dbd4d9f4aef7826bd1d33a649fac63170a54" exitCode=1 Apr 16 20:58:26.479090 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.479065 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"c2fd5fd7ff29d7a7a2b7861b6c21aaa43a5bbbfdcc1326488011c6bc3c9c0c8e"} Apr 16 20:58:26.479151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.479100 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"04ca188b3308671a4ec173c9116ebfbfef205c1f9e2ff7edb02dfd8294a37da5"} Apr 16 20:58:26.479151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.479128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerDied","Data":"46476fc5181f68c7b4fae0903219dbd4d9f4aef7826bd1d33a649fac63170a54"} Apr 16 20:58:26.479151 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.479146 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"0b5ee9eb6137b543e6ff05b63eb4e8687746f532e60c24002ae244753fa7dad7"} Apr 16 20:58:26.489563 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.489522 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" podStartSLOduration=19.489509896 podStartE2EDuration="19.489509896s" podCreationTimestamp="2026-04-16 20:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:58:26.489174201 +0000 UTC m=+20.691710933" watchObservedRunningTime="2026-04-16 20:58:26.489509896 +0000 UTC m=+20.692046627" Apr 16 20:58:26.490082 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.490055 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" event={"ID":"6d4d5e69-9954-4fed-9353-068ccd3ae9ef","Type":"ContainerStarted","Data":"8ec99aba0c51ab52533d46362d6284385e52ccd5aaba613a39df16d7ed4ce3c5"} Apr 16 20:58:26.493332 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.493298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pd7bg" event={"ID":"790473ff-de78-426c-8164-f182aadaf583","Type":"ContainerStarted","Data":"4fbd1885620305ccbc95d9447f01fd749df4549653211870ebe9aa6967e0f69a"} Apr 16 20:58:26.507259 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:26.507200 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t6bp2" podStartSLOduration=2.030979673 podStartE2EDuration="20.507181326s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.57958897 +0000 UTC m=+1.782125680" lastFinishedPulling="2026-04-16 20:58:26.055790609 +0000 UTC m=+20.258327333" observedRunningTime="2026-04-16 20:58:26.506691278 +0000 UTC m=+20.709228009" watchObservedRunningTime="2026-04-16 20:58:26.507181326 +0000 UTC m=+20.709718059" Apr 16 20:58:27.329929 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.329887 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:27.330170 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.329890 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:27.330170 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:27.330037 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:27.330170 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:27.330133 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:27.497173 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.497137 2568 generic.go:358] "Generic (PLEG): container finished" podID="ba5550cc00f884a90499316ee8207508" containerID="f9b4ca1a77ee63d2975b2fa9f31bcf8b63b74bf48e514a3cffce3bea7b838c16" exitCode=0 Apr 16 20:58:27.497323 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.497226 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerDied","Data":"f9b4ca1a77ee63d2975b2fa9f31bcf8b63b74bf48e514a3cffce3bea7b838c16"} Apr 16 20:58:27.500194 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.500173 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 20:58:27.500555 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.500525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"dd2afff302d1678421690134f345e9fbdbf7295e66c64bafb1e2c41d9712d957"} Apr 16 20:58:27.500664 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.500561 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" 
event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"b6e6db2a1559f68e82e481ec42f6997359948edea689e8859c26ffe9d53a80f4"} Apr 16 20:58:27.501940 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.501918 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="f02b446ead5caa0af9d547e71694e818398542e3a72812cf2ab7f09d88599d60" exitCode=0 Apr 16 20:58:27.502034 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.501992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"f02b446ead5caa0af9d547e71694e818398542e3a72812cf2ab7f09d88599d60"} Apr 16 20:58:27.503479 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.503456 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nrqjf" event={"ID":"5382259d-9300-4c27-9870-5d4b5910104d","Type":"ContainerStarted","Data":"9884e1c64ad4a058aec01dee1cea64da910473262858fe7e6906ccdb3f6dc8e9"} Apr 16 20:58:27.505080 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.505062 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6rx6" event={"ID":"0a96ee09-38fe-48c6-891c-03c6540c788b","Type":"ContainerStarted","Data":"0c5abc336fd68aa828743e575d9bf5c8acafb2c288205be0946c04e932453417"} Apr 16 20:58:27.506718 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.506700 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cjqkn" event={"ID":"16da182a-ff12-4e2d-800d-10e00ef1512d","Type":"ContainerStarted","Data":"0d5632e1625d82352846519fe195cc8a72584cb61a6947acd11feed509830a8f"} Apr 16 20:58:27.508047 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.508017 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" 
event={"ID":"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e","Type":"ContainerStarted","Data":"2b26833e454440505e7bbc7197e64f0065450f862ed99c09e7d4005c8ab44758"} Apr 16 20:58:27.509348 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.509328 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6dn6k" event={"ID":"9d60d121-9e76-46da-a382-d5c74b2c3a1e","Type":"ContainerStarted","Data":"1a506d7d3201c96e65f9e1df2b6b4e1f4d2b525e7647d52ef1e3134a69e68b43"} Apr 16 20:58:27.514076 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.514030 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pd7bg" podStartSLOduration=2.986338632 podStartE2EDuration="21.51401816s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.637373481 +0000 UTC m=+1.839910194" lastFinishedPulling="2026-04-16 20:58:26.165053005 +0000 UTC m=+20.367589722" observedRunningTime="2026-04-16 20:58:26.525798726 +0000 UTC m=+20.728335458" watchObservedRunningTime="2026-04-16 20:58:27.51401816 +0000 UTC m=+21.716554892" Apr 16 20:58:27.527373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.527322 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6dn6k" podStartSLOduration=3.007028638 podStartE2EDuration="21.527307182s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.53347597 +0000 UTC m=+1.736012679" lastFinishedPulling="2026-04-16 20:58:26.053754499 +0000 UTC m=+20.256291223" observedRunningTime="2026-04-16 20:58:27.526853649 +0000 UTC m=+21.729390382" watchObservedRunningTime="2026-04-16 20:58:27.527307182 +0000 UTC m=+21.729843913" Apr 16 20:58:27.545227 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.545178 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nrqjf" podStartSLOduration=3.09422782 
podStartE2EDuration="21.545159157s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.574686921 +0000 UTC m=+1.777223638" lastFinishedPulling="2026-04-16 20:58:26.025618252 +0000 UTC m=+20.228154975" observedRunningTime="2026-04-16 20:58:27.544456214 +0000 UTC m=+21.746992945" watchObservedRunningTime="2026-04-16 20:58:27.545159157 +0000 UTC m=+21.747695889" Apr 16 20:58:27.559043 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.558987 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cjqkn" podStartSLOduration=3.051699362 podStartE2EDuration="21.55896997s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.545863177 +0000 UTC m=+1.748399888" lastFinishedPulling="2026-04-16 20:58:26.053133775 +0000 UTC m=+20.255670496" observedRunningTime="2026-04-16 20:58:27.558697235 +0000 UTC m=+21.761233969" watchObservedRunningTime="2026-04-16 20:58:27.55896997 +0000 UTC m=+21.761506703" Apr 16 20:58:27.574864 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.574807 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-s6rx6" podStartSLOduration=3.07946968 podStartE2EDuration="21.574791867s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.558478191 +0000 UTC m=+1.761014902" lastFinishedPulling="2026-04-16 20:58:26.053800369 +0000 UTC m=+20.256337089" observedRunningTime="2026-04-16 20:58:27.574502334 +0000 UTC m=+21.777039068" watchObservedRunningTime="2026-04-16 20:58:27.574791867 +0000 UTC m=+21.777328599" Apr 16 20:58:27.809223 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:27.809201 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:58:28.275663 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.275550 2568 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:58:27.809221423Z","UUID":"7df39d42-a6b5-47ad-8b19-cae80382429d","Handler":null,"Name":"","Endpoint":""} Apr 16 20:58:28.277299 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.277276 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:58:28.277451 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.277306 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:58:28.329521 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.329480 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:28.329699 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:28.329598 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:28.513580 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.513533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" event={"ID":"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e","Type":"ContainerStarted","Data":"b6a09b999be4b392450543a7eea54ac7c02762213575b79771b1f0d1b500e6eb"} Apr 16 20:58:28.516839 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.516496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerStarted","Data":"a181d56501d6e5af75ca74b03413f53ab0c221cfcaa1bee228db86bf471e90db"} Apr 16 20:58:28.532888 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:28.532786 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" podStartSLOduration=22.53276718 podStartE2EDuration="22.53276718s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:58:28.53243571 +0000 UTC m=+22.734972442" watchObservedRunningTime="2026-04-16 20:58:28.53276718 +0000 UTC m=+22.735303914" Apr 16 20:58:29.329499 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.329463 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:29.329697 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.329473 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:29.329697 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:29.329598 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:29.329806 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:29.329680 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:29.520708 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.520676 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 20:58:29.521166 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.521101 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"d18ce135020fd1daf150c870c3d10303d542e85d790f4c4bbb3af7becd6096ac"} Apr 16 20:58:29.522961 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.522934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" event={"ID":"bc34332f-50e0-4c8b-b9e8-b3ef23f49a6e","Type":"ContainerStarted","Data":"15f5a418f9f5fefcfff4d8f5492c818bc6b84c1e18b2f20eb2e4758ea22931af"} Apr 16 20:58:29.541960 
ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:29.541914 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vvnxs" podStartSLOduration=2.513945773 podStartE2EDuration="23.541901347s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.593003004 +0000 UTC m=+1.795539715" lastFinishedPulling="2026-04-16 20:58:28.620958575 +0000 UTC m=+22.823495289" observedRunningTime="2026-04-16 20:58:29.541689129 +0000 UTC m=+23.744225863" watchObservedRunningTime="2026-04-16 20:58:29.541901347 +0000 UTC m=+23.744438080" Apr 16 20:58:30.329493 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:30.329461 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:30.329672 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:30.329580 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:30.483795 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:30.483760 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:30.484423 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:30.484399 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:30.525103 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:30.525073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:30.525570 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:30.525369 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6dn6k" Apr 16 20:58:31.329546 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:31.329317 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:31.329723 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:31.329316 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:31.329723 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:31.329657 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:31.329816 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:31.329794 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:32.329775 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.329740 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:32.330494 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:32.329843 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:32.531229 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.531050 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 20:58:32.531646 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.531614 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"1142d98305ad35af2dbc4e38707a6b5ba6dc435bcef10b67aa4688d3212ed65e"} Apr 16 20:58:32.531986 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.531960 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:32.532132 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.532110 2568 scope.go:117] "RemoveContainer" containerID="46476fc5181f68c7b4fae0903219dbd4d9f4aef7826bd1d33a649fac63170a54" Apr 16 20:58:32.533420 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.533395 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="f32bc2c2f9344cf30edb21fca749f17695a5b5bc3a52077f92dc7f826291bf4d" exitCode=0 Apr 16 20:58:32.533520 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.533469 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"f32bc2c2f9344cf30edb21fca749f17695a5b5bc3a52077f92dc7f826291bf4d"} Apr 16 20:58:32.549532 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:32.549483 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:33.329926 ip-10-0-138-120 kubenswrapper[2568]: I0416 
20:58:33.329895 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:33.330449 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.330013 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:33.330449 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.330045 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:33.330449 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.330126 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:33.536940 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.536901 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="971000b2b0ec1000ad3ef725935930ffe3bd56fe6cae1bd327f9f95aa053a334" exitCode=0 Apr 16 20:58:33.537127 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.536989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"971000b2b0ec1000ad3ef725935930ffe3bd56fe6cae1bd327f9f95aa053a334"} Apr 16 20:58:33.540662 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.540640 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 20:58:33.540980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.540956 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" event={"ID":"66b94037-d7e2-4eef-911d-5525fbe6343a","Type":"ContainerStarted","Data":"dee471b508043f63bb7a3acd851fb21353b087fe60cb177dafee5b325ff12ce4"} Apr 16 20:58:33.544586 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.542691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:33.544586 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.542730 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:33.559980 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.559907 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" Apr 16 20:58:33.589334 ip-10-0-138-120 kubenswrapper[2568]: I0416 
20:58:33.589281 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw" podStartSLOduration=9.135695127 podStartE2EDuration="27.589268275s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.641912748 +0000 UTC m=+1.844449459" lastFinishedPulling="2026-04-16 20:58:26.095485893 +0000 UTC m=+20.298022607" observedRunningTime="2026-04-16 20:58:33.587220425 +0000 UTC m=+27.789757161" watchObservedRunningTime="2026-04-16 20:58:33.589268275 +0000 UTC m=+27.791805009" Apr 16 20:58:33.600808 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.600785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:33.600997 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.600978 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:33.601330 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.601311 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret podName:3b1b29e8-6823-494b-9501-ec38717ca6cd nodeName:}" failed. No retries permitted until 2026-04-16 20:58:49.601288131 +0000 UTC m=+43.803824844 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret") pod "global-pull-secret-syncer-h8fjd" (UID: "3b1b29e8-6823-494b-9501-ec38717ca6cd") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:33.679194 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.679150 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h8fjd"] Apr 16 20:58:33.679333 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.679285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:33.679416 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.679367 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:33.682241 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.682215 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z6tjd"] Apr 16 20:58:33.682398 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.682331 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:33.682472 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.682446 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:33.682953 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.682928 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llm2q"] Apr 16 20:58:33.683066 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:33.683044 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:33.683197 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:33.683161 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:34.544813 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:34.544730 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="dff7789c05e9afbdd58945d90fc8f623a148fe5f0cad5a21545d0ed9de485a01" exitCode=0 Apr 16 20:58:34.545165 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:34.544817 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"dff7789c05e9afbdd58945d90fc8f623a148fe5f0cad5a21545d0ed9de485a01"} Apr 16 20:58:35.330137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:35.330092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:35.330137 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:35.330133 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:35.330333 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:35.330148 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:35.330333 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:35.330248 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:35.330630 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:35.330531 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:35.330630 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:35.330606 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:37.329429 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:37.329184 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:37.329845 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:37.329187 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:37.329845 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:37.329209 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:37.329845 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:37.329543 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h8fjd" podUID="3b1b29e8-6823-494b-9501-ec38717ca6cd" Apr 16 20:58:37.329845 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:37.329626 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z6tjd" podUID="87a25fbd-e69f-400f-a514-d2159ca520b9" Apr 16 20:58:37.329845 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:37.329743 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098" Apr 16 20:58:39.037373 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.037332 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:39.037891 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.037505 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:39.037891 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.037590 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:11.037570789 +0000 UTC m=+65.240107510 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:39.137926 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.137819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:39.138117 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.138008 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:39.138117 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.138029 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:39.138117 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.138040 2568 projected.go:194] Error preparing data for projected volume kube-api-access-rvnkk for pod openshift-network-diagnostics/network-check-target-z6tjd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:39.138117 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.138096 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk podName:87a25fbd-e69f-400f-a514-d2159ca520b9 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:59:11.138077834 +0000 UTC m=+65.340614545 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvnkk" (UniqueName: "kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk") pod "network-check-target-z6tjd" (UID: "87a25fbd-e69f-400f-a514-d2159ca520b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:39.198042 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.197989 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeReady" Apr 16 20:58:39.198230 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.198117 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:58:39.253549 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.253503 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tbbn8"] Apr 16 20:58:39.285605 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.285573 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4chc2"] Apr 16 20:58:39.285811 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.285762 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.289010 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.288776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:58:39.289010 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.288792 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:58:39.289010 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.288949 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\"" Apr 16 20:58:39.304208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.304176 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4chc2"] Apr 16 20:58:39.304208 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.304210 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tbbn8"] Apr 16 20:58:39.304410 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.304295 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.307332 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.307306 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:58:39.307473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.307329 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:58:39.307473 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.307345 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 20:58:39.307575 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.307507 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:58:39.329716 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.329691 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd" Apr 16 20:58:39.329865 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.329690 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd" Apr 16 20:58:39.329927 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.329700 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 20:58:39.332589 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332563 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:58:39.332589 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332588 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:58:39.332801 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 20:58:39.332801 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:58:39.332801 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c7977\"" Apr 16 20:58:39.333006 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.332991 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:58:39.440468 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.440633 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-config-volume\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.440633 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhqm\" (UniqueName: \"kubernetes.io/projected/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-kube-api-access-9nhqm\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.440633 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.440783 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-tmp-dir\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.440783 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.440693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcxz\" (UniqueName: \"kubernetes.io/projected/6d8eb548-23d0-403d-a61e-f91a50c71507-kube-api-access-nmcxz\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.541848 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.541813 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-tmp-dir\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.542023 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.541856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcxz\" (UniqueName: \"kubernetes.io/projected/6d8eb548-23d0-403d-a61e-f91a50c71507-kube-api-access-nmcxz\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.542192 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-tmp-dir\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.542241 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.542289 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-config-volume\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.542323 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhqm\" (UniqueName: 
\"kubernetes.io/projected/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-kube-api-access-9nhqm\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.542323 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.542406 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.542334 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:39.542406 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.542401 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:40.042367991 +0000 UTC m=+34.244904701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found Apr 16 20:58:39.542487 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.542421 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:39.542487 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:39.542475 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:58:40.042459681 +0000 UTC m=+34.244996391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found Apr 16 20:58:39.542906 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.542865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-config-volume\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:39.560057 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.560030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcxz\" (UniqueName: \"kubernetes.io/projected/6d8eb548-23d0-403d-a61e-f91a50c71507-kube-api-access-nmcxz\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 20:58:39.560201 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:39.560031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhqm\" (UniqueName: \"kubernetes.io/projected/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-kube-api-access-9nhqm\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:40.045750 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:40.045695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8" Apr 16 20:58:40.046463 ip-10-0-138-120 kubenswrapper[2568]: 
I0416 20:58:40.045797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:58:40.046463 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:40.045844 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:58:40.046463 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:40.045907 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:41.045892134 +0000 UTC m=+35.248428844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:58:40.046463 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:40.045934 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:58:40.046463 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:40.046000 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:41.0459871 +0000 UTC m=+35.248523813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:58:41.054556 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:41.054520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:58:41.055079 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:41.054599 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:58:41.055079 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:41.054672 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:58:41.055079 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:41.054732 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:43.054717695 +0000 UTC m=+37.257254405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:58:41.055079 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:41.054760 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:58:41.055079 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:41.054831 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:43.054813921 +0000 UTC m=+37.257350649 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:58:41.562192 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:41.562161 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="c92ced99f95eb39234c9a90ec04953ebd57b6718b964778564a05154672081ba" exitCode=0
Apr 16 20:58:41.562354 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:41.562208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"c92ced99f95eb39234c9a90ec04953ebd57b6718b964778564a05154672081ba"}
Apr 16 20:58:42.566354 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:42.566316 2568 generic.go:358] "Generic (PLEG): container finished" podID="53b01fc4-bf7b-492d-ac6d-538fc5854832" containerID="a8f10427f61ae01f7dbfcb7175e5e0939e1e4415bdd25e3da9888ad9e36eee48" exitCode=0
Apr 16 20:58:42.566893 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:42.566409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerDied","Data":"a8f10427f61ae01f7dbfcb7175e5e0939e1e4415bdd25e3da9888ad9e36eee48"}
Apr 16 20:58:43.069838 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:43.069652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:58:43.069993 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:43.069870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:58:43.069993 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:43.069806 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:58:43.069993 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:43.069959 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:47.069942315 +0000 UTC m=+41.272479025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:58:43.069993 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:43.069968 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:58:43.070125 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:43.070004 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:47.069992567 +0000 UTC m=+41.272529276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:58:43.570855 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:43.570820 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" event={"ID":"53b01fc4-bf7b-492d-ac6d-538fc5854832","Type":"ContainerStarted","Data":"78c94fe270bb76bdf0d6bf959b56199f3c52afbb35fb1e3cf82cc389adf52c3d"}
Apr 16 20:58:43.595237 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:43.595186 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bt6zf" podStartSLOduration=4.508368744 podStartE2EDuration="37.59517186s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:07.610886137 +0000 UTC m=+1.813422846" lastFinishedPulling="2026-04-16 20:58:40.697689249 +0000 UTC m=+34.900225962" observedRunningTime="2026-04-16 20:58:43.593682283 +0000 UTC m=+37.796219017" watchObservedRunningTime="2026-04-16 20:58:43.59517186 +0000 UTC m=+37.797708604"
Apr 16 20:58:47.098555 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:47.098509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:58:47.098910 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:47.098577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:58:47.098910 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:47.098668 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:58:47.098910 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:47.098672 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:58:47.098910 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:47.098717 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:55.098704785 +0000 UTC m=+49.301241495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:58:47.098910 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:47.098734 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:55.098721177 +0000 UTC m=+49.301257890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:58:49.615197 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:49.615161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:49.618545 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:49.618509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b1b29e8-6823-494b-9501-ec38717ca6cd-original-pull-secret\") pod \"global-pull-secret-syncer-h8fjd\" (UID: \"3b1b29e8-6823-494b-9501-ec38717ca6cd\") " pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:49.847163 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:49.847121 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h8fjd"
Apr 16 20:58:50.028712 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:50.028662 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h8fjd"]
Apr 16 20:58:50.032413 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:58:50.032368 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1b29e8_6823_494b_9501_ec38717ca6cd.slice/crio-3669a3edd2bcf34d48aa535102bf3cff1c130fb58afbbe867d9fcf77ed78db2a WatchSource:0}: Error finding container 3669a3edd2bcf34d48aa535102bf3cff1c130fb58afbbe867d9fcf77ed78db2a: Status 404 returned error can't find the container with id 3669a3edd2bcf34d48aa535102bf3cff1c130fb58afbbe867d9fcf77ed78db2a
Apr 16 20:58:50.587438 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:50.587396 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h8fjd" event={"ID":"3b1b29e8-6823-494b-9501-ec38717ca6cd","Type":"ContainerStarted","Data":"3669a3edd2bcf34d48aa535102bf3cff1c130fb58afbbe867d9fcf77ed78db2a"}
Apr 16 20:58:54.595528 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:54.595486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h8fjd" event={"ID":"3b1b29e8-6823-494b-9501-ec38717ca6cd","Type":"ContainerStarted","Data":"ce9acc739e506f537e9194992532fea0ad77647ca16d163003e2f685d3277f6c"}
Apr 16 20:58:54.612755 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:54.612701 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h8fjd" podStartSLOduration=33.680552647 podStartE2EDuration="37.612686513s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:50.034063134 +0000 UTC m=+44.236599844" lastFinishedPulling="2026-04-16 20:58:53.966196988 +0000 UTC m=+48.168733710" observedRunningTime="2026-04-16 20:58:54.611795858 +0000 UTC m=+48.814332592" watchObservedRunningTime="2026-04-16 20:58:54.612686513 +0000 UTC m=+48.815223245"
Apr 16 20:58:55.160488 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:55.160450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:58:55.160670 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:58:55.160519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:58:55.160670 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:55.160615 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:58:55.160670 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:55.160617 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:58:55.160670 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:55.160666 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:11.160653012 +0000 UTC m=+65.363189723 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:58:55.160830 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:58:55.160679 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:11.160673444 +0000 UTC m=+65.363210153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:59:05.560179 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:05.560150 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvjzw"
Apr 16 20:59:11.067929 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.067880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 20:59:11.071136 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.071103 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:59:11.078908 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.078885 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:59:11.079027 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.078948 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:15.078930818 +0000 UTC m=+129.281467533 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : secret "metrics-daemon-secret" not found
Apr 16 20:59:11.168242 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.168201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:59:11.168431 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.168258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:59:11.168431 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.168279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:59:11.168431 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.168344 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:59:11.168431 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.168410 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:59:11.168431 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.168425 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:43.168408133 +0000 UTC m=+97.370944843 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:59:11.168679 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:11.168457 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:43.168444704 +0000 UTC m=+97.370981414 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:59:11.171757 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.171739 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:59:11.181448 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.181427 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:59:11.191868 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.191839 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnkk\" (UniqueName: \"kubernetes.io/projected/87a25fbd-e69f-400f-a514-d2159ca520b9-kube-api-access-rvnkk\") pod \"network-check-target-z6tjd\" (UID: \"87a25fbd-e69f-400f-a514-d2159ca520b9\") " pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:59:11.443900 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.443812 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c7977\""
Apr 16 20:59:11.451433 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.451412 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:59:11.596108 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.596078 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z6tjd"]
Apr 16 20:59:11.599257 ip-10-0-138-120 kubenswrapper[2568]: W0416 20:59:11.599220 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a25fbd_e69f_400f_a514_d2159ca520b9.slice/crio-bbae09c9325864444bce2c6c81357fdede510b17289f1d38f67b6d45967d1c03 WatchSource:0}: Error finding container bbae09c9325864444bce2c6c81357fdede510b17289f1d38f67b6d45967d1c03: Status 404 returned error can't find the container with id bbae09c9325864444bce2c6c81357fdede510b17289f1d38f67b6d45967d1c03
Apr 16 20:59:11.628604 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:11.628567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z6tjd" event={"ID":"87a25fbd-e69f-400f-a514-d2159ca520b9","Type":"ContainerStarted","Data":"bbae09c9325864444bce2c6c81357fdede510b17289f1d38f67b6d45967d1c03"}
Apr 16 20:59:14.635812 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:14.635718 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z6tjd" event={"ID":"87a25fbd-e69f-400f-a514-d2159ca520b9","Type":"ContainerStarted","Data":"5eb2c79746249831cf2f86ddb61bf6a84cd2eb63d20eeb2a74834313ef2189ef"}
Apr 16 20:59:14.636141 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:14.635844 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 20:59:14.651751 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:14.651704 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z6tjd" podStartSLOduration=65.960442625 podStartE2EDuration="1m8.651689192s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:59:11.601079812 +0000 UTC m=+65.803616522" lastFinishedPulling="2026-04-16 20:59:14.292326376 +0000 UTC m=+68.494863089" observedRunningTime="2026-04-16 20:59:14.651025675 +0000 UTC m=+68.853562404" watchObservedRunningTime="2026-04-16 20:59:14.651689192 +0000 UTC m=+68.854225924"
Apr 16 20:59:43.177501 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:43.177341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 20:59:43.177501 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:43.177447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 20:59:43.178022 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:43.177508 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:59:43.178022 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:43.177528 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:59:43.178022 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:43.177590 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls podName:7d61f010-bef5-435b-a6dd-30cf6ec4dbe2 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:47.177570471 +0000 UTC m=+161.380107187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls") pod "dns-default-tbbn8" (UID: "7d61f010-bef5-435b-a6dd-30cf6ec4dbe2") : secret "dns-default-metrics-tls" not found
Apr 16 20:59:43.178022 ip-10-0-138-120 kubenswrapper[2568]: E0416 20:59:43.177609 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert podName:6d8eb548-23d0-403d-a61e-f91a50c71507 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:47.17759969 +0000 UTC m=+161.380136402 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert") pod "ingress-canary-4chc2" (UID: "6d8eb548-23d0-403d-a61e-f91a50c71507") : secret "canary-serving-cert" not found
Apr 16 20:59:45.639823 ip-10-0-138-120 kubenswrapper[2568]: I0416 20:59:45.639795 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z6tjd"
Apr 16 21:00:15.091997 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:15.091951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q"
Apr 16 21:00:15.092490 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:15.092072 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 21:00:15.092490 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:15.092139 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs podName:a6620072-c60f-4d78-bc86-aac34b2c5098 nodeName:}" failed. No retries permitted until 2026-04-16 21:02:17.092123693 +0000 UTC m=+251.294660407 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs") pod "network-metrics-daemon-llm2q" (UID: "a6620072-c60f-4d78-bc86-aac34b2c5098") : secret "metrics-daemon-secret" not found
Apr 16 21:00:18.830476 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.830443 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"]
Apr 16 21:00:18.832423 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.832407 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"
Apr 16 21:00:18.835610 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.835592 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-h5l2n\""
Apr 16 21:00:18.836696 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.836678 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 21:00:18.836800 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.836721 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 21:00:18.854764 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.854737 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"]
Apr 16 21:00:18.919940 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.919907 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz27v\" (UniqueName: \"kubernetes.io/projected/c556d26b-cf58-4648-b502-fd757f2e826b-kube-api-access-pz27v\") pod \"volume-data-source-validator-7c6cbb6c87-zn96z\" (UID: \"c556d26b-cf58-4648-b502-fd757f2e826b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"
Apr 16 21:00:18.928249 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.928216 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"]
Apr 16 21:00:18.930009 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.929991 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:18.932685 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.932665 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 21:00:18.933085 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.933068 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tqq5r\""
Apr 16 21:00:18.933179 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.933083 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 21:00:18.933179 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.933101 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 21:00:18.943709 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:18.943685 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"]
Apr 16 21:00:19.020826 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.020785 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrzcl\" (UniqueName: \"kubernetes.io/projected/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-kube-api-access-rrzcl\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.020826 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.020835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz27v\" (UniqueName: \"kubernetes.io/projected/c556d26b-cf58-4648-b502-fd757f2e826b-kube-api-access-pz27v\") pod \"volume-data-source-validator-7c6cbb6c87-zn96z\" (UID: \"c556d26b-cf58-4648-b502-fd757f2e826b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"
Apr 16 21:00:19.021044 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.020914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.035204 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.035174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz27v\" (UniqueName: \"kubernetes.io/projected/c556d26b-cf58-4648-b502-fd757f2e826b-kube-api-access-pz27v\") pod \"volume-data-source-validator-7c6cbb6c87-zn96z\" (UID: \"c556d26b-cf58-4648-b502-fd757f2e826b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"
Apr 16 21:00:19.121300 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.121204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.121300 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.121296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrzcl\" (UniqueName: \"kubernetes.io/projected/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-kube-api-access-rrzcl\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.121538 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:19.121353 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 21:00:19.121538 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:19.121454 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls podName:a1ac2529-94a2-4ab5-853d-fca47b26f5c8 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:19.621437988 +0000 UTC m=+133.823974698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m7w96" (UID: "a1ac2529-94a2-4ab5-853d-fca47b26f5c8") : secret "samples-operator-tls" not found
Apr 16 21:00:19.133610 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.133584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrzcl\" (UniqueName: \"kubernetes.io/projected/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-kube-api-access-rrzcl\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.141332 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.141313 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"
Apr 16 21:00:19.259081 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.259039 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z"]
Apr 16 21:00:19.263322 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:19.263296 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc556d26b_cf58_4648_b502_fd757f2e826b.slice/crio-a9cbb3a6930a013030ca13d84146994e726535f5434f315f2aad69d1c6c2dd18 WatchSource:0}: Error finding container a9cbb3a6930a013030ca13d84146994e726535f5434f315f2aad69d1c6c2dd18: Status 404 returned error can't find the container with id a9cbb3a6930a013030ca13d84146994e726535f5434f315f2aad69d1c6c2dd18
Apr 16 21:00:19.624143 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.624111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:19.624310 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:19.624238 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 21:00:19.624310 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:19.624291 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls podName:a1ac2529-94a2-4ab5-853d-fca47b26f5c8 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:20.624276656 +0000 UTC m=+134.826813369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m7w96" (UID: "a1ac2529-94a2-4ab5-853d-fca47b26f5c8") : secret "samples-operator-tls" not found
Apr 16 21:00:19.758139 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.758107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z" event={"ID":"c556d26b-cf58-4648-b502-fd757f2e826b","Type":"ContainerStarted","Data":"a9cbb3a6930a013030ca13d84146994e726535f5434f315f2aad69d1c6c2dd18"}
Apr 16 21:00:19.862389 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.862353 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7jmtg"]
Apr 16 21:00:19.864239 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.864221 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:19.867326 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.867302 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9jbj5\"" Apr 16 21:00:19.867326 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.867320 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:00:19.867525 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.867371 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 21:00:19.867525 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.867325 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 21:00:19.868345 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.868314 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 21:00:19.874060 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.874038 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 21:00:19.874503 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.874452 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7jmtg"] Apr 16 21:00:19.926877 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.926841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-trusted-ca\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:19.927050 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.926937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-config\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:19.927050 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.926994 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861b7de5-b08b-459b-b432-8f80dc4d6df7-serving-cert\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:19.927050 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:19.927020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxxb\" (UniqueName: \"kubernetes.io/projected/861b7de5-b08b-459b-b432-8f80dc4d6df7-kube-api-access-qkxxb\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.028230 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.028184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861b7de5-b08b-459b-b432-8f80dc4d6df7-serving-cert\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.028230 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.028233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qkxxb\" (UniqueName: \"kubernetes.io/projected/861b7de5-b08b-459b-b432-8f80dc4d6df7-kube-api-access-qkxxb\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.028511 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.028350 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-trusted-ca\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.028511 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.028456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-config\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.029540 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.029514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-config\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.030042 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.030001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/861b7de5-b08b-459b-b432-8f80dc4d6df7-trusted-ca\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.031058 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:00:20.031022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861b7de5-b08b-459b-b432-8f80dc4d6df7-serving-cert\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.037484 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.037460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxxb\" (UniqueName: \"kubernetes.io/projected/861b7de5-b08b-459b-b432-8f80dc4d6df7-kube-api-access-qkxxb\") pod \"console-operator-9d4b6777b-7jmtg\" (UID: \"861b7de5-b08b-459b-b432-8f80dc4d6df7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.176566 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.176472 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" Apr 16 21:00:20.298670 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.298631 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7jmtg"] Apr 16 21:00:20.530051 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:20.530008 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861b7de5_b08b_459b_b432_8f80dc4d6df7.slice/crio-12feae98ebb3c934033617e87e312365fb2048d035180607f9f17e7b8c5b5b74 WatchSource:0}: Error finding container 12feae98ebb3c934033617e87e312365fb2048d035180607f9f17e7b8c5b5b74: Status 404 returned error can't find the container with id 12feae98ebb3c934033617e87e312365fb2048d035180607f9f17e7b8c5b5b74 Apr 16 21:00:20.633879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.633833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96" Apr 16 21:00:20.634298 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:20.634131 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 21:00:20.634298 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:20.634206 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls podName:a1ac2529-94a2-4ab5-853d-fca47b26f5c8 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:22.634183966 +0000 UTC m=+136.836720691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m7w96" (UID: "a1ac2529-94a2-4ab5-853d-fca47b26f5c8") : secret "samples-operator-tls" not found Apr 16 21:00:20.761326 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.761285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" event={"ID":"861b7de5-b08b-459b-b432-8f80dc4d6df7","Type":"ContainerStarted","Data":"12feae98ebb3c934033617e87e312365fb2048d035180607f9f17e7b8c5b5b74"} Apr 16 21:00:20.762566 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:20.762540 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z" event={"ID":"c556d26b-cf58-4648-b502-fd757f2e826b","Type":"ContainerStarted","Data":"06bfa69fc0b44194974fbd1a62946c1c7b239ee95bccf58b487b182d838a27bc"} Apr 16 21:00:20.778641 ip-10-0-138-120 kubenswrapper[2568]: I0416 
21:00:20.778593 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zn96z" podStartSLOduration=1.490672561 podStartE2EDuration="2.778576553s" podCreationTimestamp="2026-04-16 21:00:18 +0000 UTC" firstStartedPulling="2026-04-16 21:00:19.264981975 +0000 UTC m=+133.467518684" lastFinishedPulling="2026-04-16 21:00:20.552885962 +0000 UTC m=+134.755422676" observedRunningTime="2026-04-16 21:00:20.778058385 +0000 UTC m=+134.980595117" watchObservedRunningTime="2026-04-16 21:00:20.778576553 +0000 UTC m=+134.981113311" Apr 16 21:00:21.430605 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.430568 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69468958bf-49nvn"] Apr 16 21:00:21.432481 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.432460 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.435495 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.435469 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 21:00:21.435657 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.435522 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 21:00:21.435657 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.435480 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 21:00:21.435780 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.435674 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pqqcq\"" Apr 16 21:00:21.440974 ip-10-0-138-120 kubenswrapper[2568]: I0416 
21:00:21.440839 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 21:00:21.446033 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.446010 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69468958bf-49nvn"] Apr 16 21:00:21.541795 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.541760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.541971 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.541815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.541971 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.541929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.542065 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.541977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jbj\" (UniqueName: 
\"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.542065 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.542033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.542156 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.542071 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.542156 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.542095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.542156 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.542141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " 
pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642611 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642792 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642792 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jbj\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642792 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642943 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642943 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642825 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.642943 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:21.642885 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 21:00:21.642943 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:21.642908 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found Apr 16 21:00:21.642943 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.642914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.643181 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:21.642970 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:22.142947155 +0000 UTC m=+136.345483868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found Apr 16 21:00:21.643181 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.643007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.643316 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.643295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.643395 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.643294 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.643750 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.643732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " 
pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.645704 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.645681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.645931 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.645909 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.652546 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.652494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jbj\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:21.652690 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:21.652667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:22.145905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.145860 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:22.146099 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:22.146036 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 21:00:22.146099 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:22.146060 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found Apr 16 21:00:22.146215 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:22.146132 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:23.146110618 +0000 UTC m=+137.348647342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found
Apr 16 21:00:22.649268 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.649231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:22.649750 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:22.649429 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 21:00:22.649750 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:22.649516 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls podName:a1ac2529-94a2-4ab5-853d-fca47b26f5c8 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:26.649495846 +0000 UTC m=+140.852032556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m7w96" (UID: "a1ac2529-94a2-4ab5-853d-fca47b26f5c8") : secret "samples-operator-tls" not found
Apr 16 21:00:22.770822 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.770793 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/0.log"
Apr 16 21:00:22.770983 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.770839 2568 generic.go:358] "Generic (PLEG): container finished" podID="861b7de5-b08b-459b-b432-8f80dc4d6df7" containerID="5acd4fe078ced5b2b844b06d8372dd2d16e5a81687bcfafdb3e56db5c7df7043" exitCode=255
Apr 16 21:00:22.770983 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.770878 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" event={"ID":"861b7de5-b08b-459b-b432-8f80dc4d6df7","Type":"ContainerDied","Data":"5acd4fe078ced5b2b844b06d8372dd2d16e5a81687bcfafdb3e56db5c7df7043"}
Apr 16 21:00:22.771111 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:22.771096 2568 scope.go:117] "RemoveContainer" containerID="5acd4fe078ced5b2b844b06d8372dd2d16e5a81687bcfafdb3e56db5c7df7043"
Apr 16 21:00:23.154044 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.153957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn"
Apr 16 21:00:23.154186 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:23.154072 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 21:00:23.154186 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:23.154084 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found
Apr 16 21:00:23.154186 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:23.154131 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:25.154117627 +0000 UTC m=+139.356654337 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found
Apr 16 21:00:23.774813 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.774777 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:00:23.775247 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.775194 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/0.log"
Apr 16 21:00:23.775247 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.775233 2568 generic.go:358] "Generic (PLEG): container finished" podID="861b7de5-b08b-459b-b432-8f80dc4d6df7" containerID="58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c" exitCode=255
Apr 16 21:00:23.775354 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.775307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" event={"ID":"861b7de5-b08b-459b-b432-8f80dc4d6df7","Type":"ContainerDied","Data":"58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c"}
Apr 16 21:00:23.775354 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.775349 2568 scope.go:117] "RemoveContainer" containerID="5acd4fe078ced5b2b844b06d8372dd2d16e5a81687bcfafdb3e56db5c7df7043"
Apr 16 21:00:23.775548 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:23.775531 2568 scope.go:117] "RemoveContainer" containerID="58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c"
Apr 16 21:00:23.775752 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:23.775735 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7jmtg_openshift-console-operator(861b7de5-b08b-459b-b432-8f80dc4d6df7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" podUID="861b7de5-b08b-459b-b432-8f80dc4d6df7"
Apr 16 21:00:24.480574 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:24.480547 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cjqkn_16da182a-ff12-4e2d-800d-10e00ef1512d/dns-node-resolver/0.log"
Apr 16 21:00:24.778929 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:24.778900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:00:24.779293 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:24.779249 2568 scope.go:117] "RemoveContainer" containerID="58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c"
Apr 16 21:00:24.779467 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:24.779436 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7jmtg_openshift-console-operator(861b7de5-b08b-459b-b432-8f80dc4d6df7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" podUID="861b7de5-b08b-459b-b432-8f80dc4d6df7"
Apr 16 21:00:25.169467 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:25.169354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn"
Apr 16 21:00:25.169607 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:25.169500 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 21:00:25.169607 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:25.169526 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found
Apr 16 21:00:25.169607 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:25.169577 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:29.169563813 +0000 UTC m=+143.372100523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found
Apr 16 21:00:25.880994 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:25.880965 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6rx6_0a96ee09-38fe-48c6-891c-03c6540c788b/node-ca/0.log"
Apr 16 21:00:26.680997 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:26.680938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:26.681186 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:26.681114 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 21:00:26.681186 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:26.681184 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls podName:a1ac2529-94a2-4ab5-853d-fca47b26f5c8 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:34.68116825 +0000 UTC m=+148.883704961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m7w96" (UID: "a1ac2529-94a2-4ab5-853d-fca47b26f5c8") : secret "samples-operator-tls" not found
Apr 16 21:00:28.032506 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.032469 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"]
Apr 16 21:00:28.034869 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.034853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"
Apr 16 21:00:28.037520 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.037495 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 21:00:28.037645 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.037533 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 21:00:28.038803 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.038771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nfvjj\""
Apr 16 21:00:28.042791 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.042767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"]
Apr 16 21:00:28.192031 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.191985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2kt\" (UniqueName: \"kubernetes.io/projected/2b8d9335-a0d6-4d52-b133-5eb8282aab9a-kube-api-access-mf2kt\") pod \"migrator-74bb7799d9-lmxn9\" (UID: \"2b8d9335-a0d6-4d52-b133-5eb8282aab9a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"
Apr 16 21:00:28.293171 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.293079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2kt\" (UniqueName: \"kubernetes.io/projected/2b8d9335-a0d6-4d52-b133-5eb8282aab9a-kube-api-access-mf2kt\") pod \"migrator-74bb7799d9-lmxn9\" (UID: \"2b8d9335-a0d6-4d52-b133-5eb8282aab9a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"
Apr 16 21:00:28.301389 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.301361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2kt\" (UniqueName: \"kubernetes.io/projected/2b8d9335-a0d6-4d52-b133-5eb8282aab9a-kube-api-access-mf2kt\") pod \"migrator-74bb7799d9-lmxn9\" (UID: \"2b8d9335-a0d6-4d52-b133-5eb8282aab9a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"
Apr 16 21:00:28.343804 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.343773 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"
Apr 16 21:00:28.462819 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.462788 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9"]
Apr 16 21:00:28.466416 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:28.466372 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8d9335_a0d6_4d52_b133_5eb8282aab9a.slice/crio-e15c31f53d43f37fcfb41708a5e139c3b9f44fb30a30e2ade458c833c0a3cc0d WatchSource:0}: Error finding container e15c31f53d43f37fcfb41708a5e139c3b9f44fb30a30e2ade458c833c0a3cc0d: Status 404 returned error can't find the container with id e15c31f53d43f37fcfb41708a5e139c3b9f44fb30a30e2ade458c833c0a3cc0d
Apr 16 21:00:28.787136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:28.787094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9" event={"ID":"2b8d9335-a0d6-4d52-b133-5eb8282aab9a","Type":"ContainerStarted","Data":"e15c31f53d43f37fcfb41708a5e139c3b9f44fb30a30e2ade458c833c0a3cc0d"}
Apr 16 21:00:29.199638 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:29.199560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn"
Apr 16 21:00:29.199963 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:29.199721 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 21:00:29.199963 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:29.199741 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found
Apr 16 21:00:29.199963 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:29.199793 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:37.199778244 +0000 UTC m=+151.402314953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found
Apr 16 21:00:29.791290 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:29.791251 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9" event={"ID":"2b8d9335-a0d6-4d52-b133-5eb8282aab9a","Type":"ContainerStarted","Data":"67c62c227d01daa16732fc9084d98ca652b1767e8ff05742f1c4a15ccb5a7dee"}
Apr 16 21:00:29.791290 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:29.791289 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9" event={"ID":"2b8d9335-a0d6-4d52-b133-5eb8282aab9a","Type":"ContainerStarted","Data":"2f098667fd7e254a7641b0edc2fb159438dd2cdd73c5c40bfa269f702050f6b7"}
Apr 16 21:00:29.808505 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:29.808423 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lmxn9" podStartSLOduration=0.727541471 podStartE2EDuration="1.808395992s" podCreationTimestamp="2026-04-16 21:00:28 +0000 UTC" firstStartedPulling="2026-04-16 21:00:28.468601698 +0000 UTC m=+142.671138408" lastFinishedPulling="2026-04-16 21:00:29.549456216 +0000 UTC m=+143.751992929" observedRunningTime="2026-04-16 21:00:29.806691958 +0000 UTC m=+144.009228690" watchObservedRunningTime="2026-04-16 21:00:29.808395992 +0000 UTC m=+144.010932714"
Apr 16 21:00:30.177193 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:30.177100 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg"
Apr 16 21:00:30.177193 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:30.177139 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg"
Apr 16 21:00:30.177550 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:30.177534 2568 scope.go:117] "RemoveContainer" containerID="58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c"
Apr 16 21:00:30.177725 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:30.177705 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7jmtg_openshift-console-operator(861b7de5-b08b-459b-b432-8f80dc4d6df7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" podUID="861b7de5-b08b-459b-b432-8f80dc4d6df7"
Apr 16 21:00:34.743682 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:34.743646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:34.746129 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:34.746099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1ac2529-94a2-4ab5-853d-fca47b26f5c8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m7w96\" (UID: \"a1ac2529-94a2-4ab5-853d-fca47b26f5c8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:34.838681 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:34.838648 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"
Apr 16 21:00:34.959676 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:34.959647 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96"]
Apr 16 21:00:35.807783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:35.807735 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96" event={"ID":"a1ac2529-94a2-4ab5-853d-fca47b26f5c8","Type":"ContainerStarted","Data":"a76410d345fcd1f52f056496fff48bb4bbc2e5f743ef94d1723af13ee6d7b387"}
Apr 16 21:00:36.812072 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:36.812039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96" event={"ID":"a1ac2529-94a2-4ab5-853d-fca47b26f5c8","Type":"ContainerStarted","Data":"12a6f0ea99073b5ae4c7fa1b628689dff007f2e04f41ab1aeebf266ed3a7313e"}
Apr 16 21:00:36.812501 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:36.812078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96" event={"ID":"a1ac2529-94a2-4ab5-853d-fca47b26f5c8","Type":"ContainerStarted","Data":"10fbe2afa6868c5bee1c247a292827ae4f87a96da701c93a1346abcd9eab4dc6"}
Apr 16 21:00:36.831748 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:36.831660 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m7w96" podStartSLOduration=17.254645769 podStartE2EDuration="18.831641617s" podCreationTimestamp="2026-04-16 21:00:18 +0000 UTC" firstStartedPulling="2026-04-16 21:00:34.996553974 +0000 UTC m=+149.199090684" lastFinishedPulling="2026-04-16 21:00:36.573549822 +0000 UTC m=+150.776086532" observedRunningTime="2026-04-16 21:00:36.831532215 +0000 UTC m=+151.034068943" watchObservedRunningTime="2026-04-16 21:00:36.831641617 +0000 UTC m=+151.034178349"
Apr 16 21:00:37.264757 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:37.264712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") pod \"image-registry-69468958bf-49nvn\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " pod="openshift-image-registry/image-registry-69468958bf-49nvn"
Apr 16 21:00:37.264906 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:37.264868 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 21:00:37.264906 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:37.264890 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69468958bf-49nvn: secret "image-registry-tls" not found
Apr 16 21:00:37.264987 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:37.264943 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls podName:8dd344a7-9f73-4d3b-8235-d73a07ccbe84 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:53.264927796 +0000 UTC m=+167.467464506 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls") pod "image-registry-69468958bf-49nvn" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84") : secret "image-registry-tls" not found
Apr 16 21:00:42.297687 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:42.297627 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tbbn8" podUID="7d61f010-bef5-435b-a6dd-30cf6ec4dbe2"
Apr 16 21:00:42.315310 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:42.315278 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4chc2" podUID="6d8eb548-23d0-403d-a61e-f91a50c71507"
Apr 16 21:00:42.352548 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:42.352505 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-llm2q" podUID="a6620072-c60f-4d78-bc86-aac34b2c5098"
Apr 16 21:00:42.826584 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:42.826554 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tbbn8"
Apr 16 21:00:43.329499 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:43.329475 2568 scope.go:117] "RemoveContainer" containerID="58817af328691e9b580e22422880cda4f9ce213583b88dca2ba3cc5b285be61c"
Apr 16 21:00:43.830818 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:43.830791 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:00:43.830972 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:43.830842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" event={"ID":"861b7de5-b08b-459b-b432-8f80dc4d6df7","Type":"ContainerStarted","Data":"aba40d573096eb1512199e98992dc507eba674094ce65df290fe49f5d8c36bdf"}
Apr 16 21:00:43.831126 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:43.831105 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg"
Apr 16 21:00:43.852581 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:43.852525 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg" podStartSLOduration=23.078839376 podStartE2EDuration="24.852510891s" podCreationTimestamp="2026-04-16 21:00:19 +0000 UTC" firstStartedPulling="2026-04-16 21:00:20.531737676 +0000 UTC m=+134.734274386" lastFinishedPulling="2026-04-16 21:00:22.305409187 +0000 UTC m=+136.507945901" observedRunningTime="2026-04-16 21:00:43.850611405 +0000 UTC m=+158.053148136" watchObservedRunningTime="2026-04-16 21:00:43.852510891 +0000 UTC m=+158.055047622"
Apr 16 21:00:44.124480 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:44.124371 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7jmtg"
Apr 16 21:00:47.237768 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.237730 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 21:00:47.238148 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.237792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 21:00:47.240305 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.240282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f010-bef5-435b-a6dd-30cf6ec4dbe2-metrics-tls\") pod \"dns-default-tbbn8\" (UID: \"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2\") " pod="openshift-dns/dns-default-tbbn8"
Apr 16 21:00:47.240353 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.240325 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d8eb548-23d0-403d-a61e-f91a50c71507-cert\") pod \"ingress-canary-4chc2\" (UID: \"6d8eb548-23d0-403d-a61e-f91a50c71507\") " pod="openshift-ingress-canary/ingress-canary-4chc2"
Apr 16 21:00:47.330205 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.330173 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\""
Apr 16 21:00:47.337645 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.337620 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tbbn8"
Apr 16 21:00:47.456869 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.456839 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tbbn8"]
Apr 16 21:00:47.459682 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:47.459651 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d61f010_bef5_435b_a6dd_30cf6ec4dbe2.slice/crio-8107c143fdb5f218219b74a5a7ab5ae65e9528ede4caf834398b19ee1ec1226b WatchSource:0}: Error finding container 8107c143fdb5f218219b74a5a7ab5ae65e9528ede4caf834398b19ee1ec1226b: Status 404 returned error can't find the container with id 8107c143fdb5f218219b74a5a7ab5ae65e9528ede4caf834398b19ee1ec1226b
Apr 16 21:00:47.840850 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:47.840808 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tbbn8" event={"ID":"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2","Type":"ContainerStarted","Data":"8107c143fdb5f218219b74a5a7ab5ae65e9528ede4caf834398b19ee1ec1226b"}
Apr 16 21:00:48.844367 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:48.844339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tbbn8" event={"ID":"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2","Type":"ContainerStarted","Data":"b2aba036f765ab936385c13194a8bd9424642995bb5adadbfe382a04b5a3bbce"}
Apr 16 21:00:49.848729 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:49.848692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tbbn8" event={"ID":"7d61f010-bef5-435b-a6dd-30cf6ec4dbe2","Type":"ContainerStarted","Data":"64f69e06bd2aee485573fd122b9e61b8a60493a192b814ff9028bb21b0c3925d"}
Apr 16 21:00:49.849142 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:49.848812 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tbbn8"
Apr 16 21:00:49.870870 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:49.870822 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tbbn8" podStartSLOduration=129.599613381 podStartE2EDuration="2m10.870807323s" podCreationTimestamp="2026-04-16 20:58:39 +0000 UTC" firstStartedPulling="2026-04-16 21:00:47.461464815 +0000 UTC m=+161.664001527" lastFinishedPulling="2026-04-16 21:00:48.732658758 +0000 UTC m=+162.935195469" observedRunningTime="2026-04-16 21:00:49.870447775 +0000 UTC m=+164.072984509" watchObservedRunningTime="2026-04-16 21:00:49.870807323 +0000 UTC m=+164.073344055"
Apr 16 21:00:50.834788 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.834758 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f8gqh"]
Apr 16 21:00:50.836703 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.836686 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.839317 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.839293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b7pz7\""
Apr 16 21:00:50.840653 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.840633 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 21:00:50.840768 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.840704 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 21:00:50.840768 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.840734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 21:00:50.840768 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.840742 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 21:00:50.852874 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.852847 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f8gqh"]
Apr 16 21:00:50.860280 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.860250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.860405 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.860288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-976ht\" (UniqueName: \"kubernetes.io/projected/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-api-access-976ht\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.860463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.860448 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7467cabe-4fe5-428e-85ad-b7c1293cf891-data-volume\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.860600 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.860578 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7467cabe-4fe5-428e-85ad-b7c1293cf891-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.860652 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.860631 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7467cabe-4fe5-428e-85ad-b7c1293cf891-crio-socket\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.880364 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.880340 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69468958bf-49nvn"]
Apr 16 21:00:50.880544 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:50.880525 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-69468958bf-49nvn" podUID="8dd344a7-9f73-4d3b-8235-d73a07ccbe84"
Apr 16 21:00:50.929171 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.929141 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fbd4b68b-464br"]
Apr 16 21:00:50.931028 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.931014 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.956219 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.956191 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fbd4b68b-464br"]
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7467cabe-4fe5-428e-85ad-b7c1293cf891-crio-socket\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-image-registry-private-configuration\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-certificates\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961517 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89b22818-d7c9-4f53-b62b-fa46caa8ef94-ca-trust-extracted\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.961590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-976ht\" (UniqueName: \"kubernetes.io/projected/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-api-access-976ht\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh"
Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-tls\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-trusted-ca\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br"
Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961692 2568 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-installation-pull-secrets\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8tj\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-kube-api-access-9f8tj\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7467cabe-4fe5-428e-85ad-b7c1293cf891-data-volume\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-bound-sa-token\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:50.962018 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.961814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7467cabe-4fe5-428e-85ad-b7c1293cf891-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.965465 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.965433 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7467cabe-4fe5-428e-85ad-b7c1293cf891-crio-socket\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.965578 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.965522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7467cabe-4fe5-428e-85ad-b7c1293cf891-data-volume\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.965651 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.965611 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7467cabe-4fe5-428e-85ad-b7c1293cf891-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.966115 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.966097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:50.979337 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:50.979310 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-976ht\" (UniqueName: \"kubernetes.io/projected/7467cabe-4fe5-428e-85ad-b7c1293cf891-kube-api-access-976ht\") pod \"insights-runtime-extractor-f8gqh\" (UID: \"7467cabe-4fe5-428e-85ad-b7c1293cf891\") " pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:51.062527 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-bound-sa-token\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062677 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-image-registry-private-configuration\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062677 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-certificates\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062677 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89b22818-d7c9-4f53-b62b-fa46caa8ef94-ca-trust-extracted\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " 
pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062817 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-tls\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062817 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-trusted-ca\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062909 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-installation-pull-secrets\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.062981 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.062949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8tj\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-kube-api-access-9f8tj\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.063266 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.063233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/89b22818-d7c9-4f53-b62b-fa46caa8ef94-ca-trust-extracted\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.063790 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.063749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-certificates\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.063894 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.063829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b22818-d7c9-4f53-b62b-fa46caa8ef94-trusted-ca\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.065319 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.065299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-image-registry-private-configuration\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.065465 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.065444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89b22818-d7c9-4f53-b62b-fa46caa8ef94-installation-pull-secrets\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 
21:00:51.065508 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.065446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-registry-tls\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.083237 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.083211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8tj\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-kube-api-access-9f8tj\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.088879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.088834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89b22818-d7c9-4f53-b62b-fa46caa8ef94-bound-sa-token\") pod \"image-registry-7fbd4b68b-464br\" (UID: \"89b22818-d7c9-4f53-b62b-fa46caa8ef94\") " pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.145457 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.145428 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f8gqh" Apr 16 21:00:51.239361 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.239309 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.273601 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.273570 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f8gqh"] Apr 16 21:00:51.277125 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:51.277092 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7467cabe_4fe5_428e_85ad_b7c1293cf891.slice/crio-d443a2cad7657f38b8633d0c2c6ddd4b0b194f126216fd93f464f3d806949b57 WatchSource:0}: Error finding container d443a2cad7657f38b8633d0c2c6ddd4b0b194f126216fd93f464f3d806949b57: Status 404 returned error can't find the container with id d443a2cad7657f38b8633d0c2c6ddd4b0b194f126216fd93f464f3d806949b57 Apr 16 21:00:51.373092 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.373055 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fbd4b68b-464br"] Apr 16 21:00:51.377348 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:51.377320 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b22818_d7c9_4f53_b62b_fa46caa8ef94.slice/crio-2c1365809fd35f88fba0267d159c37619f11c4df94308ea471f56f73c77a940a WatchSource:0}: Error finding container 2c1365809fd35f88fba0267d159c37619f11c4df94308ea471f56f73c77a940a: Status 404 returned error can't find the container with id 2c1365809fd35f88fba0267d159c37619f11c4df94308ea471f56f73c77a940a Apr 16 21:00:51.855369 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.855331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f8gqh" event={"ID":"7467cabe-4fe5-428e-85ad-b7c1293cf891","Type":"ContainerStarted","Data":"39e53d594f83440e387ccbe7af63b7c5c1811371efec1a0a7ca0dae92cf90d9f"} Apr 16 21:00:51.855892 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:00:51.855372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f8gqh" event={"ID":"7467cabe-4fe5-428e-85ad-b7c1293cf891","Type":"ContainerStarted","Data":"d443a2cad7657f38b8633d0c2c6ddd4b0b194f126216fd93f464f3d806949b57"} Apr 16 21:00:51.856559 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.856532 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" event={"ID":"89b22818-d7c9-4f53-b62b-fa46caa8ef94","Type":"ContainerStarted","Data":"31fc3942f32644b1fb93704474832af0a2397448b1711dde5fe9eee2e833b1d0"} Apr 16 21:00:51.856658 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.856566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" event={"ID":"89b22818-d7c9-4f53-b62b-fa46caa8ef94","Type":"ContainerStarted","Data":"2c1365809fd35f88fba0267d159c37619f11c4df94308ea471f56f73c77a940a"} Apr 16 21:00:51.856710 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.856667 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:00:51.857216 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.857199 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:51.861369 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.861345 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:51.868839 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.868818 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.868946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.868847 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.868946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.868872 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.868946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.868900 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.869097 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869073 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: 
\"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.869150 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869119 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.869204 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jbj\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj\") pod \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\" (UID: \"8dd344a7-9f73-4d3b-8235-d73a07ccbe84\") " Apr 16 21:00:51.869204 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869165 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:00:51.869313 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869250 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:00:51.869313 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869272 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:00:51.869526 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869506 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-ca-trust-extracted\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.869640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869582 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-certificates\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.869996 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.869704 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-trusted-ca\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.871337 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.871310 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:00:51.871434 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.871341 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj" (OuterVolumeSpecName: "kube-api-access-w9jbj") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "kube-api-access-w9jbj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:00:51.871476 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.871432 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:00:51.871569 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.871550 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8dd344a7-9f73-4d3b-8235-d73a07ccbe84" (UID: "8dd344a7-9f73-4d3b-8235-d73a07ccbe84"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:00:51.882684 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.882646 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" podStartSLOduration=1.882634061 podStartE2EDuration="1.882634061s" podCreationTimestamp="2026-04-16 21:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:51.881072255 +0000 UTC m=+166.083608987" watchObservedRunningTime="2026-04-16 21:00:51.882634061 +0000 UTC m=+166.085170793" Apr 16 21:00:51.970612 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.970579 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-bound-sa-token\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.970612 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.970609 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-installation-pull-secrets\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.970612 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.970620 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9jbj\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-kube-api-access-w9jbj\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:51.970832 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:51.970629 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-image-registry-private-configuration\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 
21:00:52.860168 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:52.860132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f8gqh" event={"ID":"7467cabe-4fe5-428e-85ad-b7c1293cf891","Type":"ContainerStarted","Data":"812eebff0362b51fbf85ea3b26e7fea9318d04b733d911235f743c6e67064697"} Apr 16 21:00:52.860168 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:52.860141 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69468958bf-49nvn" Apr 16 21:00:52.897058 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:52.897020 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69468958bf-49nvn"] Apr 16 21:00:52.901202 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:52.901169 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69468958bf-49nvn"] Apr 16 21:00:52.978630 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:52.978589 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8dd344a7-9f73-4d3b-8235-d73a07ccbe84-registry-tls\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:00:53.726278 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.726256 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q"] Apr 16 21:00:53.728136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.728120 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:53.730788 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.730768 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 21:00:53.730958 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.730807 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-zcfvk\"" Apr 16 21:00:53.737097 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.737077 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q"] Apr 16 21:00:53.786904 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.786863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgw5q\" (UID: \"9629b488-a52e-4946-9c32-2acc348e6da5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:53.864901 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.864856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f8gqh" event={"ID":"7467cabe-4fe5-428e-85ad-b7c1293cf891","Type":"ContainerStarted","Data":"0c278e8c375af852f3b9bc1a7390c69a06cdd3616054b98c81498bdfd7af67ee"} Apr 16 21:00:53.884649 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.884603 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f8gqh" podStartSLOduration=1.531206408 podStartE2EDuration="3.884589092s" podCreationTimestamp="2026-04-16 21:00:50 +0000 UTC" 
firstStartedPulling="2026-04-16 21:00:51.33434257 +0000 UTC m=+165.536879279" lastFinishedPulling="2026-04-16 21:00:53.687725253 +0000 UTC m=+167.890261963" observedRunningTime="2026-04-16 21:00:53.883839916 +0000 UTC m=+168.086376686" watchObservedRunningTime="2026-04-16 21:00:53.884589092 +0000 UTC m=+168.087125824" Apr 16 21:00:53.887275 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:53.887254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgw5q\" (UID: \"9629b488-a52e-4946-9c32-2acc348e6da5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:53.887450 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:53.887431 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 21:00:53.887515 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:00:53.887508 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates podName:9629b488-a52e-4946-9c32-2acc348e6da5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:54.387491325 +0000 UTC m=+168.590028043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-hgw5q" (UID: "9629b488-a52e-4946-9c32-2acc348e6da5") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 21:00:54.333337 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.333304 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd344a7-9f73-4d3b-8235-d73a07ccbe84" path="/var/lib/kubelet/pods/8dd344a7-9f73-4d3b-8235-d73a07ccbe84/volumes" Apr 16 21:00:54.390671 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.390634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgw5q\" (UID: \"9629b488-a52e-4946-9c32-2acc348e6da5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:54.393171 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.393153 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9629b488-a52e-4946-9c32-2acc348e6da5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgw5q\" (UID: \"9629b488-a52e-4946-9c32-2acc348e6da5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:54.636965 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.636878 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:54.754089 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.754055 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q"] Apr 16 21:00:54.757782 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:54.757754 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9629b488_a52e_4946_9c32_2acc348e6da5.slice/crio-62d75ec430255ce1e5f3d249a0c71b8bd0b9457e4c36526e93413c9f4b7ef38f WatchSource:0}: Error finding container 62d75ec430255ce1e5f3d249a0c71b8bd0b9457e4c36526e93413c9f4b7ef38f: Status 404 returned error can't find the container with id 62d75ec430255ce1e5f3d249a0c71b8bd0b9457e4c36526e93413c9f4b7ef38f Apr 16 21:00:54.870924 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:54.870887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" event={"ID":"9629b488-a52e-4946-9c32-2acc348e6da5","Type":"ContainerStarted","Data":"62d75ec430255ce1e5f3d249a0c71b8bd0b9457e4c36526e93413c9f4b7ef38f"} Apr 16 21:00:55.329944 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:55.329871 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 21:00:56.878115 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:56.878078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" event={"ID":"9629b488-a52e-4946-9c32-2acc348e6da5","Type":"ContainerStarted","Data":"80a9449e77ab47023f78b1d6cdc563f78ffc11b2dae9c14dbb9d1c17e7552d38"} Apr 16 21:00:56.878492 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:56.878291 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:56.883632 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:56.883609 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" Apr 16 21:00:56.897519 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:56.897478 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgw5q" podStartSLOduration=2.3340604369999998 podStartE2EDuration="3.897464921s" podCreationTimestamp="2026-04-16 21:00:53 +0000 UTC" firstStartedPulling="2026-04-16 21:00:54.759580909 +0000 UTC m=+168.962117618" lastFinishedPulling="2026-04-16 21:00:56.322985392 +0000 UTC m=+170.525522102" observedRunningTime="2026-04-16 21:00:56.896455787 +0000 UTC m=+171.098992520" watchObservedRunningTime="2026-04-16 21:00:56.897464921 +0000 UTC m=+171.100001654" Apr 16 21:00:57.329922 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:57.329882 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 21:00:57.333403 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:57.333358 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 21:00:57.341219 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:57.341202 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4chc2" Apr 16 21:00:57.457962 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:57.457930 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4chc2"] Apr 16 21:00:57.461350 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:00:57.461318 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d8eb548_23d0_403d_a61e_f91a50c71507.slice/crio-0b223391c1c94b9579ec65f6eda465886838a1ff2dda2a8fff539ce44555538b WatchSource:0}: Error finding container 0b223391c1c94b9579ec65f6eda465886838a1ff2dda2a8fff539ce44555538b: Status 404 returned error can't find the container with id 0b223391c1c94b9579ec65f6eda465886838a1ff2dda2a8fff539ce44555538b Apr 16 21:00:57.882017 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:57.881980 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4chc2" event={"ID":"6d8eb548-23d0-403d-a61e-f91a50c71507","Type":"ContainerStarted","Data":"0b223391c1c94b9579ec65f6eda465886838a1ff2dda2a8fff539ce44555538b"} Apr 16 21:00:59.853831 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:00:59.853799 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tbbn8" Apr 16 21:01:00.892515 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:00.892421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4chc2" 
event={"ID":"6d8eb548-23d0-403d-a61e-f91a50c71507","Type":"ContainerStarted","Data":"21ff82bf1da7dc992283f8343fe74f9bbee9c8758ac812f007fe0f0132fcc85b"} Apr 16 21:01:00.912672 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:00.912626 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4chc2" podStartSLOduration=138.847363727 podStartE2EDuration="2m21.912611599s" podCreationTimestamp="2026-04-16 20:58:39 +0000 UTC" firstStartedPulling="2026-04-16 21:00:57.463404234 +0000 UTC m=+171.665940945" lastFinishedPulling="2026-04-16 21:01:00.528652107 +0000 UTC m=+174.731188817" observedRunningTime="2026-04-16 21:01:00.911768742 +0000 UTC m=+175.114305472" watchObservedRunningTime="2026-04-16 21:01:00.912611599 +0000 UTC m=+175.115148331" Apr 16 21:01:01.842881 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.842844 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:01.844753 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.844735 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.849408 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.849365 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 21:01:01.850780 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850756 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 21:01:01.850920 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850792 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 21:01:01.850920 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850825 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 21:01:01.850920 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850830 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 21:01:01.850920 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850759 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vpj6c\"" Apr 16 21:01:01.851115 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.850759 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 21:01:01.852245 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.852227 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 21:01:01.855607 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.855588 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 21:01:01.860517 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:01.860499 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:01.945959 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.945923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.945959 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.945958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.946427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.945986 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.946427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.946012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.946427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.946195 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.946427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.946238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hklrr\" (UniqueName: \"kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:01.946427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:01.946279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046809 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " 
pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046898 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hklrr\" (UniqueName: \"kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046930 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.046980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.046956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.047192 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.047150 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert\") pod \"console-59fb76f94-qtq24\" 
(UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.048205 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.048180 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.048324 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.048288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.048556 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.048537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.048601 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.048566 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.050130 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.050112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert\") pod 
\"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.050130 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.050121 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.056875 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.056852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hklrr\" (UniqueName: \"kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr\") pod \"console-59fb76f94-qtq24\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.153575 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.153491 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:02.275082 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.275051 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:02.278654 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:02.278626 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa349265_8f10_4718_b52a_553dd987d335.slice/crio-50f79ab99e0c43159e90e8c7e68983faffef0a954768d629aff8f130de6305c7 WatchSource:0}: Error finding container 50f79ab99e0c43159e90e8c7e68983faffef0a954768d629aff8f130de6305c7: Status 404 returned error can't find the container with id 50f79ab99e0c43159e90e8c7e68983faffef0a954768d629aff8f130de6305c7 Apr 16 21:01:02.898995 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:02.898955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fb76f94-qtq24" event={"ID":"aa349265-8f10-4718-b52a-553dd987d335","Type":"ContainerStarted","Data":"50f79ab99e0c43159e90e8c7e68983faffef0a954768d629aff8f130de6305c7"} Apr 16 21:01:03.158305 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.158225 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"] Apr 16 21:01:03.161093 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.161075 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" Apr 16 21:01:03.164906 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.164883 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 21:01:03.166245 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.166223 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 21:01:03.166342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.166227 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2xhpk\"" Apr 16 21:01:03.166342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.166262 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 21:01:03.166342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.166260 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 21:01:03.166342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.166254 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 21:01:03.175626 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.175602 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"] Apr 16 21:01:03.177916 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.177896 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"] Apr 16 21:01:03.179957 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.179933 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" Apr 16 21:01:03.183658 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.183612 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-9f7mz\"" Apr 16 21:01:03.183935 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.183917 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 21:01:03.184013 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.183942 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 21:01:03.184013 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.183963 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 21:01:03.195926 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.195901 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"] Apr 16 21:01:03.211904 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.211880 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ks9m9"] Apr 16 21:01:03.214074 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.214057 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.217180 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.217160 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 21:01:03.217488 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.217466 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 21:01:03.217596 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.217552 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-l4vt2\"" Apr 16 21:01:03.218979 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.218961 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 21:01:03.257890 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.257854 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2kjw\" (UniqueName: \"kubernetes.io/projected/a9e3ec16-5b10-458c-92e9-fc70564250dc-kube-api-access-j2kjw\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" Apr 16 21:01:03.258059 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.257894 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" Apr 16 21:01:03.258059 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:03.257927 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-wtmp\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.258059 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.257968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-root\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.258059 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8l8\" (UniqueName: \"kubernetes.io/projected/bb636b7d-dc5b-447a-8265-30a0d805ab14-kube-api-access-7r8l8\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258113 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258136 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-sys\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258170 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-textfile\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258204 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" Apr 16 21:01:03.258287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrm9x\" (UniqueName: \"kubernetes.io/projected/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-api-access-nrm9x\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: 
I0416 21:01:03.258306 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-metrics-client-ca\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258519 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.258617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.258538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e3ec16-5b10-458c-92e9-fc70564250dc-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.359161 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.359359 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.359359 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359220 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.359359 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-metrics-client-ca\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.359359 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:03.359280 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 21:01:03.359596 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:03.359369 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls podName:bb636b7d-dc5b-447a-8265-30a0d805ab14 nodeName:}" failed. No retries permitted until 2026-04-16 21:01:03.859345665 +0000 UTC m=+178.061882380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls") pod "node-exporter-ks9m9" (UID: "bb636b7d-dc5b-447a-8265-30a0d805ab14") : secret "node-exporter-tls" not found
Apr 16 21:01:03.359596 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359677 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e3ec16-5b10-458c-92e9-fc70564250dc-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2kjw\" (UniqueName: \"kubernetes.io/projected/a9e3ec16-5b10-458c-92e9-fc70564250dc-kube-api-access-j2kjw\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.359787 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359779 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-wtmp\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-root\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8l8\" (UniqueName: \"kubernetes.io/projected/bb636b7d-dc5b-447a-8265-30a0d805ab14-kube-api-access-7r8l8\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.359964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-sys\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-textfile\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360038 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.360071 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrm9x\" (UniqueName: \"kubernetes.io/projected/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-api-access-nrm9x\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.360653 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e3ec16-5b10-458c-92e9-fc70564250dc-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.360653 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-metrics-client-ca\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360808 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-wtmp\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.360879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.360850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-root\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.361526 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.361009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-textfile\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.361526 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.361048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb636b7d-dc5b-447a-8265-30a0d805ab14-sys\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.361526 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:03.361238 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 21:01:03.361526 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:03.361311 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls podName:a9e3ec16-5b10-458c-92e9-fc70564250dc nodeName:}" failed. No retries permitted until 2026-04-16 21:01:03.861294501 +0000 UTC m=+178.063831212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-7sgvx" (UID: "a9e3ec16-5b10-458c-92e9-fc70564250dc") : secret "openshift-state-metrics-tls" not found
Apr 16 21:01:03.361526 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.361422 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.362079 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.361963 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.362079 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.362011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.362244 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.362116 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.362817 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.362772 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.363255 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.363236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.363852 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.363831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.363940 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.363898 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.372147 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.371945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2kjw\" (UniqueName: \"kubernetes.io/projected/a9e3ec16-5b10-458c-92e9-fc70564250dc-kube-api-access-j2kjw\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.372541 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.372519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrm9x\" (UniqueName: \"kubernetes.io/projected/c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086-kube-api-access-nrm9x\") pod \"kube-state-metrics-69db897b98-8v8q5\" (UID: \"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.373946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.373923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8l8\" (UniqueName: \"kubernetes.io/projected/bb636b7d-dc5b-447a-8265-30a0d805ab14-kube-api-access-7r8l8\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.488829 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.488740 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"
Apr 16 21:01:03.630508 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.630472 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8v8q5"]
Apr 16 21:01:03.634712 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:03.634532 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e0ea71_e0ca_4bd3_8a86_ebcf3bb88086.slice/crio-8fa37e5537015934b643cc50d0da0210fa6f8c8e3f7c92706747f848e07cf2de WatchSource:0}: Error finding container 8fa37e5537015934b643cc50d0da0210fa6f8c8e3f7c92706747f848e07cf2de: Status 404 returned error can't find the container with id 8fa37e5537015934b643cc50d0da0210fa6f8c8e3f7c92706747f848e07cf2de
Apr 16 21:01:03.863366 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.863325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.863557 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.863407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.866143 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.866119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9e3ec16-5b10-458c-92e9-fc70564250dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7sgvx\" (UID: \"a9e3ec16-5b10-458c-92e9-fc70564250dc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:03.866260 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.866234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb636b7d-dc5b-447a-8265-30a0d805ab14-node-exporter-tls\") pod \"node-exporter-ks9m9\" (UID: \"bb636b7d-dc5b-447a-8265-30a0d805ab14\") " pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:03.902440 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:03.902397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" event={"ID":"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086","Type":"ContainerStarted","Data":"8fa37e5537015934b643cc50d0da0210fa6f8c8e3f7c92706747f848e07cf2de"}
Apr 16 21:01:04.070057 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.070027 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"
Apr 16 21:01:04.123791 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.123648 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ks9m9"
Apr 16 21:01:04.135991 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:04.135952 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb636b7d_dc5b_447a_8265_30a0d805ab14.slice/crio-e150c28863f6ee415b253dfefbb69eb7bde5994f47d8a968aa33eff6afd9d5f6 WatchSource:0}: Error finding container e150c28863f6ee415b253dfefbb69eb7bde5994f47d8a968aa33eff6afd9d5f6: Status 404 returned error can't find the container with id e150c28863f6ee415b253dfefbb69eb7bde5994f47d8a968aa33eff6afd9d5f6
Apr 16 21:01:04.215584 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.215543 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 21:01:04.218689 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.218666 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.221562 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.221532 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 21:01:04.221676 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.221558 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 21:01:04.221834 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.221806 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 21:01:04.221834 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.221825 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 21:01:04.222044 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222026 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 21:01:04.222119 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222060 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 21:01:04.222167 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222136 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8wv9m\""
Apr 16 21:01:04.222220 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222193 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 21:01:04.222274 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222249 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 21:01:04.222332 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.222303 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 21:01:04.226587 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.226563 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx"]
Apr 16 21:01:04.231347 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:04.231312 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e3ec16_5b10_458c_92e9_fc70564250dc.slice/crio-845ae4e7fa4e529d55ff9abdfe06afaf349a1d1427d73e48f999e4db8aec742e WatchSource:0}: Error finding container 845ae4e7fa4e529d55ff9abdfe06afaf349a1d1427d73e48f999e4db8aec742e: Status 404 returned error can't find the container with id 845ae4e7fa4e529d55ff9abdfe06afaf349a1d1427d73e48f999e4db8aec742e
Apr 16 21:01:04.232758 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.232724 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 21:01:04.267198 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267198 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267351 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267225 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-web-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267351 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267351 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267293 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267351 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267351 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2hn\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-kube-api-access-6s2hn\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-config-out\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.267879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.267670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.368148 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.368114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2hn\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-kube-api-access-6s2hn\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.368348 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.368332 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.368929 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.368911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.369937 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.369754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370363 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.369886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-config-out\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370500 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370589 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370589 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370696 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370589 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370696 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370618 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-web-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370696 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370849 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370692 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370849 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370849 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.370753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 21:01:04.370993 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:04.370878 2568 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls"
not found Apr 16 21:01:04.370993 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:04.370934 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls podName:091e6cfb-6782-44ce-8308-7094ca7107cf nodeName:}" failed. No retries permitted until 2026-04-16 21:01:04.870914827 +0000 UTC m=+179.073451558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "091e6cfb-6782-44ce-8308-7094ca7107cf") : secret "alertmanager-main-tls" not found Apr 16 21:01:04.372309 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:04.371293 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle podName:091e6cfb-6782-44ce-8308-7094ca7107cf nodeName:}" failed. No retries permitted until 2026-04-16 21:01:04.871236981 +0000 UTC m=+179.073773710 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "091e6cfb-6782-44ce-8308-7094ca7107cf") : configmap references non-existent config key: ca-bundle.crt Apr 16 21:01:04.372309 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.371367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.372309 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.372266 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.375658 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.375582 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.378090 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.376893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-web-config\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.378090 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.376991 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.378090 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.377236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/091e6cfb-6782-44ce-8308-7094ca7107cf-config-out\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.378090 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.378045 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.378350 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.378118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.379679 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.379627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.382648 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.382602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2hn\" (UniqueName: \"kubernetes.io/projected/091e6cfb-6782-44ce-8308-7094ca7107cf-kube-api-access-6s2hn\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.875312 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.875275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.875503 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.875320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.876211 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.876180 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/091e6cfb-6782-44ce-8308-7094ca7107cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.877946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.877922 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/091e6cfb-6782-44ce-8308-7094ca7107cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"091e6cfb-6782-44ce-8308-7094ca7107cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:04.907466 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.907419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ks9m9" event={"ID":"bb636b7d-dc5b-447a-8265-30a0d805ab14","Type":"ContainerStarted","Data":"e150c28863f6ee415b253dfefbb69eb7bde5994f47d8a968aa33eff6afd9d5f6"} Apr 16 21:01:04.909340 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.909311 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" event={"ID":"a9e3ec16-5b10-458c-92e9-fc70564250dc","Type":"ContainerStarted","Data":"132ee8ffb620091dcb62eb49c48a0b0e3980982703628486ab5b1ab99340ad33"} Apr 16 21:01:04.909473 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.909346 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" event={"ID":"a9e3ec16-5b10-458c-92e9-fc70564250dc","Type":"ContainerStarted","Data":"1a029834a2b4b7478f99599a24f3532cef46500381453dd8173e6dbefc4f7593"} Apr 16 21:01:04.909473 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:04.909359 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" event={"ID":"a9e3ec16-5b10-458c-92e9-fc70564250dc","Type":"ContainerStarted","Data":"845ae4e7fa4e529d55ff9abdfe06afaf349a1d1427d73e48f999e4db8aec742e"} Apr 16 21:01:05.133753 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:05.133675 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 21:01:06.109588 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.109530 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 21:01:06.117189 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:06.116507 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod091e6cfb_6782_44ce_8308_7094ca7107cf.slice/crio-40813e780fcb2b06d8a7c6b10f658b8aa43531dcf348227ffa5b530a7e3dc747 WatchSource:0}: Error finding container 40813e780fcb2b06d8a7c6b10f658b8aa43531dcf348227ffa5b530a7e3dc747: Status 404 returned error can't find the container with id 40813e780fcb2b06d8a7c6b10f658b8aa43531dcf348227ffa5b530a7e3dc747 Apr 16 21:01:06.917053 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.917012 2568 generic.go:358] "Generic (PLEG): container finished" podID="bb636b7d-dc5b-447a-8265-30a0d805ab14" containerID="95b0e9815607e01512e0c6d16b33a0fabe662b4b9cba2867df31a738f85c0abf" exitCode=0 Apr 16 21:01:06.917235 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.917089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ks9m9" event={"ID":"bb636b7d-dc5b-447a-8265-30a0d805ab14","Type":"ContainerDied","Data":"95b0e9815607e01512e0c6d16b33a0fabe662b4b9cba2867df31a738f85c0abf"} Apr 16 21:01:06.918659 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.918634 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fb76f94-qtq24" event={"ID":"aa349265-8f10-4718-b52a-553dd987d335","Type":"ContainerStarted","Data":"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00"} Apr 16 21:01:06.920904 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.920875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" 
event={"ID":"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086","Type":"ContainerStarted","Data":"f01a3a7cd143e28df4ecd820a0a4557cff768e78461745aa400980f2c00fdb11"} Apr 16 21:01:06.921000 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.920910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" event={"ID":"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086","Type":"ContainerStarted","Data":"16130dcca90b9bb9cc9d7084e1a9d86c60c2cb46a64427f74ca419bf9a4999a2"} Apr 16 21:01:06.921000 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.920924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" event={"ID":"c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086","Type":"ContainerStarted","Data":"9d226f1ac2ce8d6930b045086403b112c0037989cf247d85d9e7788dbbaf9d55"} Apr 16 21:01:06.923161 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.923129 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" event={"ID":"a9e3ec16-5b10-458c-92e9-fc70564250dc","Type":"ContainerStarted","Data":"b5bcbda06b4ac81e10ab404b4be25c6d30919acd5637d175ff5f2e5d8790a554"} Apr 16 21:01:06.924356 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.924334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"40813e780fcb2b06d8a7c6b10f658b8aa43531dcf348227ffa5b530a7e3dc747"} Apr 16 21:01:06.954887 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.954843 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59fb76f94-qtq24" podStartSLOduration=2.274788117 podStartE2EDuration="5.954830074s" podCreationTimestamp="2026-04-16 21:01:01 +0000 UTC" firstStartedPulling="2026-04-16 21:01:02.280503073 +0000 UTC m=+176.483039783" lastFinishedPulling="2026-04-16 
21:01:05.960545029 +0000 UTC m=+180.163081740" observedRunningTime="2026-04-16 21:01:06.953841059 +0000 UTC m=+181.156377793" watchObservedRunningTime="2026-04-16 21:01:06.954830074 +0000 UTC m=+181.157366805" Apr 16 21:01:06.970786 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.970681 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7sgvx" podStartSLOduration=1.860075212 podStartE2EDuration="3.970661457s" podCreationTimestamp="2026-04-16 21:01:03 +0000 UTC" firstStartedPulling="2026-04-16 21:01:04.38726894 +0000 UTC m=+178.589805656" lastFinishedPulling="2026-04-16 21:01:06.49785519 +0000 UTC m=+180.700391901" observedRunningTime="2026-04-16 21:01:06.970273845 +0000 UTC m=+181.172810578" watchObservedRunningTime="2026-04-16 21:01:06.970661457 +0000 UTC m=+181.173198192" Apr 16 21:01:06.991211 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:06.991147 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8v8q5" podStartSLOduration=1.667501472 podStartE2EDuration="3.991128823s" podCreationTimestamp="2026-04-16 21:01:03 +0000 UTC" firstStartedPulling="2026-04-16 21:01:03.636533811 +0000 UTC m=+177.839070525" lastFinishedPulling="2026-04-16 21:01:05.960161153 +0000 UTC m=+180.162697876" observedRunningTime="2026-04-16 21:01:06.990144234 +0000 UTC m=+181.192680974" watchObservedRunningTime="2026-04-16 21:01:06.991128823 +0000 UTC m=+181.193665557" Apr 16 21:01:07.928174 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:07.928142 2568 generic.go:358] "Generic (PLEG): container finished" podID="091e6cfb-6782-44ce-8308-7094ca7107cf" containerID="1c3dbc72bd4dbef489d4141de8dd714a553c71e3160325d0ac8fa9ee02af9867" exitCode=0 Apr 16 21:01:07.928625 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:07.928228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerDied","Data":"1c3dbc72bd4dbef489d4141de8dd714a553c71e3160325d0ac8fa9ee02af9867"} Apr 16 21:01:07.930431 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:07.930406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ks9m9" event={"ID":"bb636b7d-dc5b-447a-8265-30a0d805ab14","Type":"ContainerStarted","Data":"e8f8cd520f821b2e200c57740d05243f62b6bca35df6df2b80d8e4cd5269723f"} Apr 16 21:01:07.930523 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:07.930438 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ks9m9" event={"ID":"bb636b7d-dc5b-447a-8265-30a0d805ab14","Type":"ContainerStarted","Data":"95c98f22ad6f00545bbc65113a4b9e319f0ebd045f07e5beefe390c2e5c82963"} Apr 16 21:01:07.985946 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:07.985893 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ks9m9" podStartSLOduration=3.16088159 podStartE2EDuration="4.985878705s" podCreationTimestamp="2026-04-16 21:01:03 +0000 UTC" firstStartedPulling="2026-04-16 21:01:04.138212779 +0000 UTC m=+178.340749509" lastFinishedPulling="2026-04-16 21:01:05.9632099 +0000 UTC m=+180.165746624" observedRunningTime="2026-04-16 21:01:07.98479401 +0000 UTC m=+182.187330745" watchObservedRunningTime="2026-04-16 21:01:07.985878705 +0000 UTC m=+182.188415434" Apr 16 21:01:08.400887 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.400843 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v"] Apr 16 21:01:08.443229 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.443191 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v"] Apr 16 21:01:08.443364 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.443345 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.446775 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.446736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 21:01:08.446905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.446784 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 21:01:08.446905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.446736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 21:01:08.446905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.446736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 21:01:08.446905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.446871 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 21:01:08.447196 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.447173 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-nx7vz\"" Apr 16 21:01:08.451564 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.451545 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 21:01:08.508226 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tv6g\" (UniqueName: \"kubernetes.io/projected/08025926-ab85-485a-b626-2d52b4c806ed-kube-api-access-2tv6g\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: 
\"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508419 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508244 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508419 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508283 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-metrics-client-ca\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508419 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508419 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508587 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:08.508450 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-federate-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508587 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.508587 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.508507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-serving-certs-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609598 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609603 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-serving-certs-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tv6g\" (UniqueName: \"kubernetes.io/projected/08025926-ab85-485a-b626-2d52b4c806ed-kube-api-access-2tv6g\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609686 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-metrics-client-ca\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.609783 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" 
(UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.610034 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.610034 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.609894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-federate-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.610591 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.610563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-serving-certs-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.610705 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.610660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-metrics-client-ca\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.611001 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.610977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.612431 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.612395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.612514 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.612445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-telemeter-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.612766 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.612743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.612806 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.612793 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08025926-ab85-485a-b626-2d52b4c806ed-federate-client-tls\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: 
\"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.618712 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.618695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tv6g\" (UniqueName: \"kubernetes.io/projected/08025926-ab85-485a-b626-2d52b4c806ed-kube-api-access-2tv6g\") pod \"telemeter-client-78dcd8b57c-2vd2v\" (UID: \"08025926-ab85-485a-b626-2d52b4c806ed\") " pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.752780 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.752747 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" Apr 16 21:01:08.825887 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.825852 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:08.895990 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.895934 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v"] Apr 16 21:01:08.901155 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:08.901125 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08025926_ab85_485a_b626_2d52b4c806ed.slice/crio-f335ef6aa138e34e0b6e161be6baef4b2f02fbed38723afa6d7f69c9b2d5e51a WatchSource:0}: Error finding container f335ef6aa138e34e0b6e161be6baef4b2f02fbed38723afa6d7f69c9b2d5e51a: Status 404 returned error can't find the container with id f335ef6aa138e34e0b6e161be6baef4b2f02fbed38723afa6d7f69c9b2d5e51a Apr 16 21:01:08.934392 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:08.934340 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" 
event={"ID":"08025926-ab85-485a-b626-2d52b4c806ed","Type":"ContainerStarted","Data":"f335ef6aa138e34e0b6e161be6baef4b2f02fbed38723afa6d7f69c9b2d5e51a"} Apr 16 21:01:09.481952 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.481918 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 21:01:09.485609 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.485587 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488789 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488856 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jbwnx\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488882 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488912 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488926 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1moju10enn430\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.488968 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 21:01:09.489136 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.489141 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 21:01:09.490009 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.489991 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 21:01:09.490162 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.490119 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 21:01:09.490252 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.490229 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 21:01:09.490317 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.490265 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 21:01:09.490371 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.490323 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 21:01:09.492892 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.492869 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 21:01:09.497340 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.495942 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 21:01:09.501964 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.501940 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 21:01:09.517145 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517306 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517306 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517191 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517306 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517306 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517530 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517324 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517530 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517530 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517530 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-config-out\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517728 
ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517529 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-web-config\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517728 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517728 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517728 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmhr\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-kube-api-access-fsmhr\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517728 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-config\") pod \"prometheus-k8s-0\" (UID: 
\"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517907 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517769 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517907 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517803 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517907 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517822 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.517907 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.517879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619023 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.618984 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619023 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619273 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619273 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619273 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-config-out\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619273 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:09.619139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-web-config\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619484 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619545 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619600 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmhr\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-kube-api-access-fsmhr\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619600 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-config\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619681 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:09.619635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619681 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619774 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.619774 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.619893 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.620000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.620269 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.620068 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.621088 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.620613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.621088 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.621078 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.621343 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.621314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.622806 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.622771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d0f404b7-076d-4b10-a425-05da184a3e01-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.623424 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.623398 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-web-config\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.624112 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.624016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-config\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.624112 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.624069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.624112 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.624084 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.624911 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.624865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.625184 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.625136 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d0f404b7-076d-4b10-a425-05da184a3e01-config-out\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.625843 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.625802 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.625843 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.625853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.625843 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.626053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.625843 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.626208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.627130 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.627109 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d0f404b7-076d-4b10-a425-05da184a3e01-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.629239 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.629217 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmhr\" (UniqueName: \"kubernetes.io/projected/d0f404b7-076d-4b10-a425-05da184a3e01-kube-api-access-fsmhr\") pod \"prometheus-k8s-0\" (UID: \"d0f404b7-076d-4b10-a425-05da184a3e01\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:09.799070 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:09.799031 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:10.219848 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.219818 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 21:01:10.557499 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:10.557455 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f404b7_076d_4b10_a425_05da184a3e01.slice/crio-167faef16370282869c2a41e9cbbecb12d7a8a552acae13cf6dbf952b01051f8 WatchSource:0}: Error finding container 167faef16370282869c2a41e9cbbecb12d7a8a552acae13cf6dbf952b01051f8: Status 404 returned error can't find the container with id 167faef16370282869c2a41e9cbbecb12d7a8a552acae13cf6dbf952b01051f8 Apr 16 21:01:10.943023 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.942988 2568 generic.go:358] "Generic (PLEG): container finished" podID="d0f404b7-076d-4b10-a425-05da184a3e01" containerID="21d10bbf27f83911fc81b17baa1a00266ec56745cfef45b70c0a112a85f7a2d5" exitCode=0 Apr 16 21:01:10.943222 ip-10-0-138-120 kubenswrapper[2568]: I0416 
21:01:10.943086 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerDied","Data":"21d10bbf27f83911fc81b17baa1a00266ec56745cfef45b70c0a112a85f7a2d5"} Apr 16 21:01:10.943222 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.943129 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"167faef16370282869c2a41e9cbbecb12d7a8a552acae13cf6dbf952b01051f8"} Apr 16 21:01:10.945809 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.945782 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"58dde47f995d3ad88873ec42932c342be4a506a5c56c75ae540ed277fa1c2b3a"} Apr 16 21:01:10.945918 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.945816 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"aff4f701e7fa90927e44ae63b28c950efd7d859d7282b616c6f8d8977514c7b5"} Apr 16 21:01:10.945918 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.945828 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"2358fedceaeefab1620e0c6b03f7428784338ec02490e28050c8f609db88f82a"} Apr 16 21:01:10.945918 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.945837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"f43f7f36cfb4e33b0d88d8a7c88768bf67976bb99384bbfd70772481fc0af1b4"} Apr 16 21:01:10.945918 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:01:10.945845 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"7a3188ac9ef8d2ec55071bb2af0541a5a6b16c3cd0a3200f68238a805869b72d"} Apr 16 21:01:10.947686 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.947665 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" event={"ID":"08025926-ab85-485a-b626-2d52b4c806ed","Type":"ContainerStarted","Data":"ece1f1b420af40819ddc07c72c7bf0977033ddd800300b50e1c451c74d197973"} Apr 16 21:01:10.947795 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.947691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" event={"ID":"08025926-ab85-485a-b626-2d52b4c806ed","Type":"ContainerStarted","Data":"4efb30c50dc9876b51538187b364e3b66414e35770afc3820dee907a6c366860"} Apr 16 21:01:10.947795 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.947701 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" event={"ID":"08025926-ab85-485a-b626-2d52b4c806ed","Type":"ContainerStarted","Data":"94a21f19e2f2c2145ef37c917c2ff9342088b8c2194fe16330e927ea87bc6d8e"} Apr 16 21:01:10.991650 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:10.991600 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-78dcd8b57c-2vd2v" podStartSLOduration=1.307112166 podStartE2EDuration="2.991585251s" podCreationTimestamp="2026-04-16 21:01:08 +0000 UTC" firstStartedPulling="2026-04-16 21:01:08.903220128 +0000 UTC m=+183.105756838" lastFinishedPulling="2026-04-16 21:01:10.587693214 +0000 UTC m=+184.790229923" observedRunningTime="2026-04-16 21:01:10.989554341 +0000 UTC m=+185.192091073" watchObservedRunningTime="2026-04-16 21:01:10.991585251 +0000 UTC 
m=+185.194121983" Apr 16 21:01:12.154225 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:12.154196 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:12.864955 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:12.864911 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fbd4b68b-464br" Apr 16 21:01:12.957027 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:12.956975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"091e6cfb-6782-44ce-8308-7094ca7107cf","Type":"ContainerStarted","Data":"04cba74a572849aaf56eac97b22e641aed90564a428e359f2561f2600d0cc1f0"} Apr 16 21:01:12.988130 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:12.988070 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.999103399 podStartE2EDuration="8.988048449s" podCreationTimestamp="2026-04-16 21:01:04 +0000 UTC" firstStartedPulling="2026-04-16 21:01:06.121313915 +0000 UTC m=+180.323850625" lastFinishedPulling="2026-04-16 21:01:12.110258965 +0000 UTC m=+186.312795675" observedRunningTime="2026-04-16 21:01:12.98615596 +0000 UTC m=+187.188692692" watchObservedRunningTime="2026-04-16 21:01:12.988048449 +0000 UTC m=+187.190585184" Apr 16 21:01:14.539469 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.539437 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:01:14.541613 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.541598 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.554021 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.553998 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:01:14.671896 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.671851 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672073 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.671904 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672073 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.671929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672073 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.671947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hppl\" (UniqueName: \"kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " 
pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672073 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.671983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672073 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.672060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.672340 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.672086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.772784 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.772966 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.772966 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772872 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.772966 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.772966 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.773180 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.772991 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.773180 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.773014 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4hppl\" (UniqueName: \"kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.773618 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.773569 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.773868 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.773819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.774124 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.774101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.774124 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.774112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.775954 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.775930 
2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.776087 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.775939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.786313 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.786282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hppl\" (UniqueName: \"kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl\") pod \"console-7b47d4479c-8wg47\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.851206 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.851107 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:14.965974 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.965939 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"a810668c3c0626370fc48a34389d9d8c7788b4b044878a2b59425cad26e0404d"} Apr 16 21:01:14.965974 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.965979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"c42e8263dedfb667457898465ba517c2697efa37b8025c3f9c116f27e167f48a"} Apr 16 21:01:14.996842 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:14.996808 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:01:15.000977 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:01:15.000946 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8d6720_ddab_4afb_a6fa_30771bbcb52f.slice/crio-64442d97771fa2d341bcfcff64dafada6a85ac8dae24b85955305f9bacbf004d WatchSource:0}: Error finding container 64442d97771fa2d341bcfcff64dafada6a85ac8dae24b85955305f9bacbf004d: Status 404 returned error can't find the container with id 64442d97771fa2d341bcfcff64dafada6a85ac8dae24b85955305f9bacbf004d Apr 16 21:01:15.970971 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:15.970918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b47d4479c-8wg47" event={"ID":"df8d6720-ddab-4afb-a6fa-30771bbcb52f","Type":"ContainerStarted","Data":"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5"} Apr 16 21:01:15.970971 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:15.970965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7b47d4479c-8wg47" event={"ID":"df8d6720-ddab-4afb-a6fa-30771bbcb52f","Type":"ContainerStarted","Data":"64442d97771fa2d341bcfcff64dafada6a85ac8dae24b85955305f9bacbf004d"} Apr 16 21:01:15.973317 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:15.973295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"80aa9d04bb9a9aa4a84a01cfd8b2d6320040944df94b07071ca6c17d0ddc5c1c"} Apr 16 21:01:15.990019 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:15.989973 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b47d4479c-8wg47" podStartSLOduration=1.989960355 podStartE2EDuration="1.989960355s" podCreationTimestamp="2026-04-16 21:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:01:15.988607523 +0000 UTC m=+190.191144259" watchObservedRunningTime="2026-04-16 21:01:15.989960355 +0000 UTC m=+190.192497086" Apr 16 21:01:16.981549 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:16.981509 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"463a4fbb92385b7323d239ecaa26864614f5f11462776cb85c8407bea1b89dbb"} Apr 16 21:01:16.981549 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:16.981553 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"4fd8204a354e09f255390252faca5bc9297508e81406b5f90ec0d9d9d3fa6644"} Apr 16 21:01:16.981549 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:16.981563 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d0f404b7-076d-4b10-a425-05da184a3e01","Type":"ContainerStarted","Data":"f603d1eff957a4e89e736520bb9e35fcaab5386592f248f2bfac2d58b675531e"} Apr 16 21:01:17.012839 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:17.012785 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.068161422 podStartE2EDuration="8.012771605s" podCreationTimestamp="2026-04-16 21:01:09 +0000 UTC" firstStartedPulling="2026-04-16 21:01:10.944418368 +0000 UTC m=+185.146955079" lastFinishedPulling="2026-04-16 21:01:15.889028552 +0000 UTC m=+190.091565262" observedRunningTime="2026-04-16 21:01:17.010796122 +0000 UTC m=+191.213332910" watchObservedRunningTime="2026-04-16 21:01:17.012771605 +0000 UTC m=+191.215308336" Apr 16 21:01:19.799486 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:19.799435 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:01:24.852226 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:24.852193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:24.852691 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:24.852234 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:24.857863 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:24.857839 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:25.016189 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:25.016163 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:01:33.953229 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:33.953164 2568 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-59fb76f94-qtq24" podUID="aa349265-8f10-4718-b52a-553dd987d335" containerName="console" containerID="cri-o://fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00" gracePeriod=15 Apr 16 21:01:34.191717 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.191693 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fb76f94-qtq24_aa349265-8f10-4718-b52a-553dd987d335/console/0.log" Apr 16 21:01:34.191834 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.191755 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:34.362485 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362438 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362675 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362508 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hklrr\" (UniqueName: \"kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362675 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362577 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362675 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362631 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362675 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362658 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362694 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362879 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362740 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert\") pod \"aa349265-8f10-4718-b52a-553dd987d335\" (UID: \"aa349265-8f10-4718-b52a-553dd987d335\") " Apr 16 21:01:34.362973 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.362879 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:34.363143 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363096 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:34.363143 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363122 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config" (OuterVolumeSpecName: "console-config") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:34.363347 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363329 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:34.363487 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363465 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-console-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.363801 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363782 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-oauth-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.363862 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363809 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-service-ca\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.363862 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.363827 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa349265-8f10-4718-b52a-553dd987d335-trusted-ca-bundle\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.365131 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.365108 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:34.365295 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.365276 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:34.365353 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.365298 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr" (OuterVolumeSpecName: "kube-api-access-hklrr") pod "aa349265-8f10-4718-b52a-553dd987d335" (UID: "aa349265-8f10-4718-b52a-553dd987d335"). InnerVolumeSpecName "kube-api-access-hklrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:34.465153 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.465097 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hklrr\" (UniqueName: \"kubernetes.io/projected/aa349265-8f10-4718-b52a-553dd987d335-kube-api-access-hklrr\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.465153 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.465146 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:34.465153 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:34.465161 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa349265-8f10-4718-b52a-553dd987d335-console-oauth-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:01:35.047015 
ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.046981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fb76f94-qtq24_aa349265-8f10-4718-b52a-553dd987d335/console/0.log" Apr 16 21:01:35.047463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.047033 2568 generic.go:358] "Generic (PLEG): container finished" podID="aa349265-8f10-4718-b52a-553dd987d335" containerID="fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00" exitCode=2 Apr 16 21:01:35.047463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.047111 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fb76f94-qtq24" event={"ID":"aa349265-8f10-4718-b52a-553dd987d335","Type":"ContainerDied","Data":"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00"} Apr 16 21:01:35.047463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.047128 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fb76f94-qtq24" Apr 16 21:01:35.047463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.047149 2568 scope.go:117] "RemoveContainer" containerID="fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00" Apr 16 21:01:35.047463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.047138 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fb76f94-qtq24" event={"ID":"aa349265-8f10-4718-b52a-553dd987d335","Type":"ContainerDied","Data":"50f79ab99e0c43159e90e8c7e68983faffef0a954768d629aff8f130de6305c7"} Apr 16 21:01:35.055965 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.055943 2568 scope.go:117] "RemoveContainer" containerID="fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00" Apr 16 21:01:35.056219 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:01:35.056198 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00\": container with ID starting with fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00 not found: ID does not exist" containerID="fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00" Apr 16 21:01:35.056272 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.056227 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00"} err="failed to get container status \"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00\": rpc error: code = NotFound desc = could not find container \"fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00\": container with ID starting with fbefc1f2ccb792f993e7de105c349cfdfca38199b4d450aac5b474db13e66f00 not found: ID does not exist" Apr 16 21:01:35.068635 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.068610 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:35.072619 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:35.072597 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59fb76f94-qtq24"] Apr 16 21:01:36.333898 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:01:36.333861 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa349265-8f10-4718-b52a-553dd987d335" path="/var/lib/kubelet/pods/aa349265-8f10-4718-b52a-553dd987d335/volumes" Apr 16 21:02:09.802406 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:09.799193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:02:09.828980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:09.828950 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:02:10.183362 ip-10-0-138-120 kubenswrapper[2568]: I0416 
21:02:10.183268 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 21:02:17.150529 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:17.150489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 21:02:17.152941 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:17.152919 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6620072-c60f-4d78-bc86-aac34b2c5098-metrics-certs\") pod \"network-metrics-daemon-llm2q\" (UID: \"a6620072-c60f-4d78-bc86-aac34b2c5098\") " pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 21:02:17.236139 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:17.236101 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 21:02:17.241770 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:17.241744 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llm2q" Apr 16 21:02:17.371780 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:17.371732 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llm2q"] Apr 16 21:02:17.374739 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:02:17.374715 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6620072_c60f_4d78_bc86_aac34b2c5098.slice/crio-1c2d17c8403d7f9a3daf3792ed1c64897ce645205aaa5ffbd2b21d8acf34b639 WatchSource:0}: Error finding container 1c2d17c8403d7f9a3daf3792ed1c64897ce645205aaa5ffbd2b21d8acf34b639: Status 404 returned error can't find the container with id 1c2d17c8403d7f9a3daf3792ed1c64897ce645205aaa5ffbd2b21d8acf34b639 Apr 16 21:02:18.186564 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:18.186524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llm2q" event={"ID":"a6620072-c60f-4d78-bc86-aac34b2c5098","Type":"ContainerStarted","Data":"1c2d17c8403d7f9a3daf3792ed1c64897ce645205aaa5ffbd2b21d8acf34b639"} Apr 16 21:02:19.191038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:19.191001 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llm2q" event={"ID":"a6620072-c60f-4d78-bc86-aac34b2c5098","Type":"ContainerStarted","Data":"e6fb50a24d5383f452dfbf464448e8e3c44b3b6336c04ebb860efebf0156cacc"} Apr 16 21:02:19.191038 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:19.191038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llm2q" event={"ID":"a6620072-c60f-4d78-bc86-aac34b2c5098","Type":"ContainerStarted","Data":"b20d91093ed245cc9769935993dfc33fa1071b7ffb232691f51b3d3bc65b6e04"} Apr 16 21:02:19.215167 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:19.215120 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-llm2q" podStartSLOduration=252.008829003 podStartE2EDuration="4m13.215104389s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 21:02:17.376778299 +0000 UTC m=+251.579315008" lastFinishedPulling="2026-04-16 21:02:18.583053685 +0000 UTC m=+252.785590394" observedRunningTime="2026-04-16 21:02:19.211934301 +0000 UTC m=+253.414471033" watchObservedRunningTime="2026-04-16 21:02:19.215104389 +0000 UTC m=+253.417641122" Apr 16 21:02:31.452743 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:31.452660 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:02:56.473171 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.473126 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b47d4479c-8wg47" podUID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" containerName="console" containerID="cri-o://658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5" gracePeriod=15 Apr 16 21:02:56.730092 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.730037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b47d4479c-8wg47_df8d6720-ddab-4afb-a6fa-30771bbcb52f/console/0.log" Apr 16 21:02:56.730203 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.730098 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:02:56.899324 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899277 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899531 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899531 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899392 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899531 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899420 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899531 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899531 
ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899486 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hppl\" (UniqueName: \"kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899797 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899618 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca\") pod \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\" (UID: \"df8d6720-ddab-4afb-a6fa-30771bbcb52f\") " Apr 16 21:02:56.899797 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899775 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config" (OuterVolumeSpecName: "console-config") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:02:56.899892 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899822 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:02:56.899944 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899897 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:56.899944 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.899917 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-oauth-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:56.900088 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.900047 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:02:56.900268 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.900144 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca" (OuterVolumeSpecName: "service-ca") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:02:56.901819 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.901785 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:02:56.901921 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.901882 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:02:56.901982 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:56.901929 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl" (OuterVolumeSpecName: "kube-api-access-4hppl") pod "df8d6720-ddab-4afb-a6fa-30771bbcb52f" (UID: "df8d6720-ddab-4afb-a6fa-30771bbcb52f"). InnerVolumeSpecName "kube-api-access-4hppl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:02:57.000921 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.000884 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hppl\" (UniqueName: \"kubernetes.io/projected/df8d6720-ddab-4afb-a6fa-30771bbcb52f-kube-api-access-4hppl\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.000921 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.000918 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-service-ca\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.001118 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.000936 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df8d6720-ddab-4afb-a6fa-30771bbcb52f-trusted-ca-bundle\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" 
Apr 16 21:02:57.001118 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.000948 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.001118 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.000961 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df8d6720-ddab-4afb-a6fa-30771bbcb52f-console-oauth-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.304129 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304044 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b47d4479c-8wg47_df8d6720-ddab-4afb-a6fa-30771bbcb52f/console/0.log" Apr 16 21:02:57.304129 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304091 2568 generic.go:358] "Generic (PLEG): container finished" podID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" containerID="658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5" exitCode=2 Apr 16 21:02:57.304312 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304154 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b47d4479c-8wg47" Apr 16 21:02:57.304312 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b47d4479c-8wg47" event={"ID":"df8d6720-ddab-4afb-a6fa-30771bbcb52f","Type":"ContainerDied","Data":"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5"} Apr 16 21:02:57.304312 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304213 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b47d4479c-8wg47" event={"ID":"df8d6720-ddab-4afb-a6fa-30771bbcb52f","Type":"ContainerDied","Data":"64442d97771fa2d341bcfcff64dafada6a85ac8dae24b85955305f9bacbf004d"} Apr 16 21:02:57.304312 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.304228 2568 scope.go:117] "RemoveContainer" containerID="658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5" Apr 16 21:02:57.314252 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.314163 2568 scope.go:117] "RemoveContainer" containerID="658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5" Apr 16 21:02:57.314661 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:02:57.314632 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5\": container with ID starting with 658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5 not found: ID does not exist" containerID="658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5" Apr 16 21:02:57.314752 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.314674 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5"} err="failed to get container status \"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5\": rpc error: code = 
NotFound desc = could not find container \"658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5\": container with ID starting with 658d498d01db2142ed26b1a70aebdb5e9993f1be36308260e954bf92d3d859e5 not found: ID does not exist" Apr 16 21:02:57.327832 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.327800 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:02:57.331950 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:57.331926 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b47d4479c-8wg47"] Apr 16 21:02:58.333850 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:02:58.333806 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" path="/var/lib/kubelet/pods/df8d6720-ddab-4afb-a6fa-30771bbcb52f/volumes" Apr 16 21:03:06.202719 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:03:06.202686 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:03:06.205525 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:03:06.205499 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:03:06.205899 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:03:06.205882 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:03:06.208409 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:03:06.208368 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:03:06.212692 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:03:06.212667 
2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 21:04:03.727842 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.727749 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-b2cl2"] Apr 16 21:04:03.728328 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728276 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" containerName="console" Apr 16 21:04:03.728328 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728298 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" containerName="console" Apr 16 21:04:03.728328 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728320 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa349265-8f10-4718-b52a-553dd987d335" containerName="console" Apr 16 21:04:03.728328 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728328 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa349265-8f10-4718-b52a-553dd987d335" containerName="console" Apr 16 21:04:03.728582 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728426 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa349265-8f10-4718-b52a-553dd987d335" containerName="console" Apr 16 21:04:03.728582 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.728439 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="df8d6720-ddab-4afb-a6fa-30771bbcb52f" containerName="console" Apr 16 21:04:03.730329 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.730305 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.733079 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.733056 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 21:04:03.733204 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.733059 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 21:04:03.733204 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.733138 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-qht6l\"" Apr 16 21:04:03.738151 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.738124 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-b2cl2"] Apr 16 21:04:03.758501 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.758468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8cz5\" (UniqueName: \"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-kube-api-access-b8cz5\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.758663 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.758541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-bound-sa-token\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.859749 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.859710 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8cz5\" (UniqueName: 
\"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-kube-api-access-b8cz5\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.859915 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.859814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-bound-sa-token\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.868129 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.868094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-bound-sa-token\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:03.868251 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:03.868163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8cz5\" (UniqueName: \"kubernetes.io/projected/2ba56669-6530-4a35-afe8-f75c1e731dac-kube-api-access-b8cz5\") pod \"cert-manager-759f64656b-b2cl2\" (UID: \"2ba56669-6530-4a35-afe8-f75c1e731dac\") " pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:04.046975 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:04.046937 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-b2cl2" Apr 16 21:04:04.173249 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:04.173218 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-b2cl2"] Apr 16 21:04:04.177129 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:04:04.177100 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ba56669_6530_4a35_afe8_f75c1e731dac.slice/crio-9dc1bfa60b38bd02ae237ec2184d1c4fba18fb65ff99604a2e8b131b48aa37f0 WatchSource:0}: Error finding container 9dc1bfa60b38bd02ae237ec2184d1c4fba18fb65ff99604a2e8b131b48aa37f0: Status 404 returned error can't find the container with id 9dc1bfa60b38bd02ae237ec2184d1c4fba18fb65ff99604a2e8b131b48aa37f0 Apr 16 21:04:04.178971 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:04.178954 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:04:04.499698 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:04.499668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-b2cl2" event={"ID":"2ba56669-6530-4a35-afe8-f75c1e731dac","Type":"ContainerStarted","Data":"9dc1bfa60b38bd02ae237ec2184d1c4fba18fb65ff99604a2e8b131b48aa37f0"} Apr 16 21:04:09.515991 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:09.515951 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-b2cl2" event={"ID":"2ba56669-6530-4a35-afe8-f75c1e731dac","Type":"ContainerStarted","Data":"f79e14bb437caa6c6e6dd5594dabf5ebf6db86e6e9b77875a01a399675d522a9"} Apr 16 21:04:09.534373 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:09.534322 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-b2cl2" podStartSLOduration=1.334171074 podStartE2EDuration="6.534308555s" podCreationTimestamp="2026-04-16 21:04:03 +0000 
UTC" firstStartedPulling="2026-04-16 21:04:04.179083197 +0000 UTC m=+358.381619910" lastFinishedPulling="2026-04-16 21:04:09.379220675 +0000 UTC m=+363.581757391" observedRunningTime="2026-04-16 21:04:09.532309929 +0000 UTC m=+363.734846659" watchObservedRunningTime="2026-04-16 21:04:09.534308555 +0000 UTC m=+363.736845286" Apr 16 21:04:22.706198 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.706153 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp"] Apr 16 21:04:22.708493 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.708475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.712043 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.712025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 21:04:22.712293 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.712272 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 21:04:22.712388 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.712323 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 21:04:22.712631 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.712608 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjnvs\"" Apr 16 21:04:22.712835 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.712821 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 21:04:22.723895 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.723874 2568 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp"] Apr 16 21:04:22.824310 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.824266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.824524 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.824329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxwh\" (UniqueName: \"kubernetes.io/projected/f45cf097-e1ff-4b18-b113-cab89f129ef3-kube-api-access-9fxwh\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.824524 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.824351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.925287 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.925254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " 
pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.925491 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.925303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fxwh\" (UniqueName: \"kubernetes.io/projected/f45cf097-e1ff-4b18-b113-cab89f129ef3-kube-api-access-9fxwh\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.925491 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.925333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.927942 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.927911 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.928047 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:22.927957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f45cf097-e1ff-4b18-b113-cab89f129ef3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:22.933712 ip-10-0-138-120 kubenswrapper[2568]: I0416 
21:04:22.933685 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxwh\" (UniqueName: \"kubernetes.io/projected/f45cf097-e1ff-4b18-b113-cab89f129ef3-kube-api-access-9fxwh\") pod \"opendatahub-operator-controller-manager-5f94c666bb-lpddp\" (UID: \"f45cf097-e1ff-4b18-b113-cab89f129ef3\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:23.019483 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:23.019445 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" Apr 16 21:04:23.156768 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:23.156745 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp"] Apr 16 21:04:23.159476 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:04:23.159434 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf45cf097_e1ff_4b18_b113_cab89f129ef3.slice/crio-bddc4845c63fde118f5d17f5df4cdbbe902c532923945b620f7cda7e78e2e396 WatchSource:0}: Error finding container bddc4845c63fde118f5d17f5df4cdbbe902c532923945b620f7cda7e78e2e396: Status 404 returned error can't find the container with id bddc4845c63fde118f5d17f5df4cdbbe902c532923945b620f7cda7e78e2e396 Apr 16 21:04:23.558096 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:23.558065 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" event={"ID":"f45cf097-e1ff-4b18-b113-cab89f129ef3","Type":"ContainerStarted","Data":"bddc4845c63fde118f5d17f5df4cdbbe902c532923945b620f7cda7e78e2e396"} Apr 16 21:04:26.570534 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:26.570496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" 
event={"ID":"f45cf097-e1ff-4b18-b113-cab89f129ef3","Type":"ContainerStarted","Data":"05b62c99250ec19e371d7036714b4ae3e57e26fa23617eb9547baf347dec9805"}
Apr 16 21:04:26.570979 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:26.570619 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp"
Apr 16 21:04:26.594534 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:26.594480 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp" podStartSLOduration=1.768611908 podStartE2EDuration="4.594465057s" podCreationTimestamp="2026-04-16 21:04:22 +0000 UTC" firstStartedPulling="2026-04-16 21:04:23.16118992 +0000 UTC m=+377.363726629" lastFinishedPulling="2026-04-16 21:04:25.987043064 +0000 UTC m=+380.189579778" observedRunningTime="2026-04-16 21:04:26.594122471 +0000 UTC m=+380.796659214" watchObservedRunningTime="2026-04-16 21:04:26.594465057 +0000 UTC m=+380.797001788"
Apr 16 21:04:37.575876 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:37.575843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-lpddp"
Apr 16 21:04:38.363098 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.363059 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"]
Apr 16 21:04:38.366007 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.365988 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.369719 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.369699 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 21:04:38.370647 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.370624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 21:04:38.370751 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.370655 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 21:04:38.370751 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.370662 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-hdldg\""
Apr 16 21:04:38.370751 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.370662 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 21:04:38.370751 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.370661 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 21:04:38.385731 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.385698 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"]
Apr 16 21:04:38.465111 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.465065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.465278 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.465203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80d62e77-a324-421f-a766-de3a332f82d5-manager-config\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.465278 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.465257 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfv4t\" (UniqueName: \"kubernetes.io/projected/80d62e77-a324-421f-a766-de3a332f82d5-kube-api-access-cfv4t\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.465353 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.465315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.566629 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.566588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.566785 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.566669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80d62e77-a324-421f-a766-de3a332f82d5-manager-config\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.566785 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.566705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfv4t\" (UniqueName: \"kubernetes.io/projected/80d62e77-a324-421f-a766-de3a332f82d5-kube-api-access-cfv4t\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.566785 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.566726 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.567354 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.567328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80d62e77-a324-421f-a766-de3a332f82d5-manager-config\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.569238 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.569218 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.569325 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.569273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80d62e77-a324-421f-a766-de3a332f82d5-cert\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.583561 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.583532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfv4t\" (UniqueName: \"kubernetes.io/projected/80d62e77-a324-421f-a766-de3a332f82d5-kube-api-access-cfv4t\") pod \"lws-controller-manager-c5b769f8c-hm9s4\" (UID: \"80d62e77-a324-421f-a766-de3a332f82d5\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.675677 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.675569 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:38.804452 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:38.804422 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"]
Apr 16 21:04:38.807334 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:04:38.807303 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d62e77_a324_421f_a766_de3a332f82d5.slice/crio-3888cf3cbcb47f2099f3631e9b3bcf72c37fd1531d2977a4efda0129f3db5007 WatchSource:0}: Error finding container 3888cf3cbcb47f2099f3631e9b3bcf72c37fd1531d2977a4efda0129f3db5007: Status 404 returned error can't find the container with id 3888cf3cbcb47f2099f3631e9b3bcf72c37fd1531d2977a4efda0129f3db5007
Apr 16 21:04:39.613813 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:39.613771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4" event={"ID":"80d62e77-a324-421f-a766-de3a332f82d5","Type":"ContainerStarted","Data":"3888cf3cbcb47f2099f3631e9b3bcf72c37fd1531d2977a4efda0129f3db5007"}
Apr 16 21:04:42.112205 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.112173 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"]
Apr 16 21:04:42.163573 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.163538 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"]
Apr 16 21:04:42.163730 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.163616 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.168311 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.168263 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 16 21:04:42.168311 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.168276 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 21:04:42.168573 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.168284 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 21:04:42.168573 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.168291 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-x4wfj\""
Apr 16 21:04:42.168573 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.168569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 16 21:04:42.305507 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.305468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgpf\" (UniqueName: \"kubernetes.io/projected/c308960d-54f7-4b00-b5ef-867a99c50916-kube-api-access-8pgpf\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.305507 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.305512 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c308960d-54f7-4b00-b5ef-867a99c50916-tls-certs\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.305706 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.305532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c308960d-54f7-4b00-b5ef-867a99c50916-tmp\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.407052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgpf\" (UniqueName: \"kubernetes.io/projected/c308960d-54f7-4b00-b5ef-867a99c50916-kube-api-access-8pgpf\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.407137 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.407102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c308960d-54f7-4b00-b5ef-867a99c50916-tls-certs\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.407395 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.407240 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c308960d-54f7-4b00-b5ef-867a99c50916-tmp\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.409575 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.409541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c308960d-54f7-4b00-b5ef-867a99c50916-tmp\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.409796 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.409776 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c308960d-54f7-4b00-b5ef-867a99c50916-tls-certs\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.415654 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.415632 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgpf\" (UniqueName: \"kubernetes.io/projected/c308960d-54f7-4b00-b5ef-867a99c50916-kube-api-access-8pgpf\") pod \"kube-auth-proxy-56b49765cd-9zmvz\" (UID: \"c308960d-54f7-4b00-b5ef-867a99c50916\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.473427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.473359 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"
Apr 16 21:04:42.793612 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:42.793583 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz"]
Apr 16 21:04:42.795917 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:04:42.795886 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc308960d_54f7_4b00_b5ef_867a99c50916.slice/crio-043a430f13b991d9e823d784676e662e378e6e63fc3d883686b7407714f86db9 WatchSource:0}: Error finding container 043a430f13b991d9e823d784676e662e378e6e63fc3d883686b7407714f86db9: Status 404 returned error can't find the container with id 043a430f13b991d9e823d784676e662e378e6e63fc3d883686b7407714f86db9
Apr 16 21:04:43.630832 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:43.630790 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4" event={"ID":"80d62e77-a324-421f-a766-de3a332f82d5","Type":"ContainerStarted","Data":"c8f1b2e091d864c09786050c625a617e10d27a6cab92eceb17c118850d573890"}
Apr 16 21:04:43.631308 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:43.630855 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:04:43.632919 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:43.632884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz" event={"ID":"c308960d-54f7-4b00-b5ef-867a99c50916","Type":"ContainerStarted","Data":"043a430f13b991d9e823d784676e662e378e6e63fc3d883686b7407714f86db9"}
Apr 16 21:04:43.649563 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:43.649489 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4" podStartSLOduration=1.788350104 podStartE2EDuration="5.649468917s" podCreationTimestamp="2026-04-16 21:04:38 +0000 UTC" firstStartedPulling="2026-04-16 21:04:38.809223207 +0000 UTC m=+393.011759917" lastFinishedPulling="2026-04-16 21:04:42.670342021 +0000 UTC m=+396.872878730" observedRunningTime="2026-04-16 21:04:43.647670718 +0000 UTC m=+397.850207454" watchObservedRunningTime="2026-04-16 21:04:43.649468917 +0000 UTC m=+397.852005652"
Apr 16 21:04:46.644590 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:46.644555 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz" event={"ID":"c308960d-54f7-4b00-b5ef-867a99c50916","Type":"ContainerStarted","Data":"b63d848265464beafb6d93a14c46279ada580aec0c04e032290aaf22f7b3cf25"}
Apr 16 21:04:46.662115 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:46.662057 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56b49765cd-9zmvz" podStartSLOduration=1.697088253 podStartE2EDuration="4.662038498s" podCreationTimestamp="2026-04-16 21:04:42 +0000 UTC" firstStartedPulling="2026-04-16 21:04:42.797688225 +0000 UTC m=+397.000224934" lastFinishedPulling="2026-04-16 21:04:45.76263846 +0000 UTC m=+399.965175179" observedRunningTime="2026-04-16 21:04:46.660703114 +0000 UTC m=+400.863239839" watchObservedRunningTime="2026-04-16 21:04:46.662038498 +0000 UTC m=+400.864575231"
Apr 16 21:04:54.638898 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:04:54.638863 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-hm9s4"
Apr 16 21:06:31.376363 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.376324 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"]
Apr 16 21:06:31.378579 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.378562 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.381321 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.381299 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 21:06:31.381437 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.381299 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 21:06:31.382485 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.382462 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ljsfg\""
Apr 16 21:06:31.382601 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.382491 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 21:06:31.382601 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.382549 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 21:06:31.387983 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.387961 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"]
Apr 16 21:06:31.450974 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.450939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99jn\" (UniqueName: \"kubernetes.io/projected/fa853e1d-0a8e-46f6-a383-669539117800-kube-api-access-n99jn\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.451156 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.450989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.451156 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.451036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa853e1d-0a8e-46f6-a383-669539117800-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.552316 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.552287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n99jn\" (UniqueName: \"kubernetes.io/projected/fa853e1d-0a8e-46f6-a383-669539117800-kube-api-access-n99jn\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.552529 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.552336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.552529 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.552355 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa853e1d-0a8e-46f6-a383-669539117800-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.552529 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:06:31.552482 2568 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 16 21:06:31.552694 ip-10-0-138-120 kubenswrapper[2568]: E0416 21:06:31.552547 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert podName:fa853e1d-0a8e-46f6-a383-669539117800 nodeName:}" failed. No retries permitted until 2026-04-16 21:06:32.052527364 +0000 UTC m=+506.255064076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-cpfml" (UID: "fa853e1d-0a8e-46f6-a383-669539117800") : secret "plugin-serving-cert" not found
Apr 16 21:06:31.553030 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.553011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa853e1d-0a8e-46f6-a383-669539117800-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:31.565256 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:31.565227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99jn\" (UniqueName: \"kubernetes.io/projected/fa853e1d-0a8e-46f6-a383-669539117800-kube-api-access-n99jn\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:32.056969 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:32.056930 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:32.059479 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:32.059449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa853e1d-0a8e-46f6-a383-669539117800-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cpfml\" (UID: \"fa853e1d-0a8e-46f6-a383-669539117800\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:32.288781 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:32.288742 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"
Apr 16 21:06:32.416400 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:32.416356 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml"]
Apr 16 21:06:32.418876 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:06:32.418846 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa853e1d_0a8e_46f6_a383_669539117800.slice/crio-45edee3debc87e64e2f403f22833cb5d8631f303cc7310903c952d86d7d5499b WatchSource:0}: Error finding container 45edee3debc87e64e2f403f22833cb5d8631f303cc7310903c952d86d7d5499b: Status 404 returned error can't find the container with id 45edee3debc87e64e2f403f22833cb5d8631f303cc7310903c952d86d7d5499b
Apr 16 21:06:32.996739 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:32.996705 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml" event={"ID":"fa853e1d-0a8e-46f6-a383-669539117800","Type":"ContainerStarted","Data":"45edee3debc87e64e2f403f22833cb5d8631f303cc7310903c952d86d7d5499b"}
Apr 16 21:06:41.489621 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.489587 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"]
Apr 16 21:06:41.491887 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.491865 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.497791 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.497760 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ldzvv\""
Apr 16 21:06:41.516290 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.516254 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"]
Apr 16 21:06:41.544868 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.544832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.545042 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.544962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfvf\" (UniqueName: \"kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.645844 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.645803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.646014 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.645914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfvf\" (UniqueName: \"kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.646216 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.646187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.658956 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.658929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfvf\" (UniqueName: \"kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf\") pod \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.803755 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.803719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:06:41.944961 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.944915 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"]
Apr 16 21:06:41.954750 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:41.954715 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"]
Apr 16 21:06:57.092721 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:57.092666 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml" event={"ID":"fa853e1d-0a8e-46f6-a383-669539117800","Type":"ContainerStarted","Data":"ac5d320b2c7ebf8669d5d1532f3573884c2b80f5300e2ab99e8f932500cc2b97"}
Apr 16 21:06:57.109560 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:06:57.109507 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cpfml" podStartSLOduration=2.229040811 podStartE2EDuration="26.10949251s" podCreationTimestamp="2026-04-16 21:06:31 +0000 UTC" firstStartedPulling="2026-04-16 21:06:32.42024378 +0000 UTC m=+506.622780494" lastFinishedPulling="2026-04-16 21:06:56.300695478 +0000 UTC m=+530.503232193" observedRunningTime="2026-04-16 21:06:57.108714138 +0000 UTC m=+531.311250869" watchObservedRunningTime="2026-04-16 21:06:57.10949251 +0000 UTC m=+531.312029242"
Apr 16 21:06:58.660623 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:06:58.660587 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92cb18d7_af0b_494d_b5e6_05268ca7e7d6.slice/crio-6ca7e4c5d9bebf4b658b425d825b114ae90a3fddd3003a542b306f8eb962fde7 WatchSource:0}: Error finding container 6ca7e4c5d9bebf4b658b425d825b114ae90a3fddd3003a542b306f8eb962fde7: Status 404 returned error can't find the container with id 6ca7e4c5d9bebf4b658b425d825b114ae90a3fddd3003a542b306f8eb962fde7
Apr 16 21:07:08.132808 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.132775 2568 generic.go:358] "Generic (PLEG): container finished" podID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" containerID="621ffce197f40f392b4d4fc3e484aa927e88ee8487f6b1d6b55a0d45bd3916fb" exitCode=1
Apr 16 21:07:08.135126 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.135092 2568 status_manager.go:895] "Failed to get status for pod" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf" err="pods \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" is forbidden: User \"system:node:ip-10-0-138-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-120.ec2.internal' and this object"
Apr 16 21:07:08.164746 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.164725 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:07:08.167624 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.167598 2568 status_manager.go:895] "Failed to get status for pod" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf" err="pods \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" is forbidden: User \"system:node:ip-10-0-138-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-120.ec2.internal' and this object"
Apr 16 21:07:08.300560 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.300517 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rfvf\" (UniqueName: \"kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf\") pod \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") "
Apr 16 21:07:08.300748 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.300591 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume\") pod \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\" (UID: \"92cb18d7-af0b-494d-b5e6-05268ca7e7d6\") "
Apr 16 21:07:08.300875 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.300817 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "92cb18d7-af0b-494d-b5e6-05268ca7e7d6" (UID: "92cb18d7-af0b-494d-b5e6-05268ca7e7d6"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:07:08.302977 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.302947 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf" (OuterVolumeSpecName: "kube-api-access-6rfvf") pod "92cb18d7-af0b-494d-b5e6-05268ca7e7d6" (UID: "92cb18d7-af0b-494d-b5e6-05268ca7e7d6"). InnerVolumeSpecName "kube-api-access-6rfvf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 21:07:08.334011 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.333972 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" path="/var/lib/kubelet/pods/92cb18d7-af0b-494d-b5e6-05268ca7e7d6/volumes"
Apr 16 21:07:08.402047 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.401973 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rfvf\" (UniqueName: \"kubernetes.io/projected/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-kube-api-access-6rfvf\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\""
Apr 16 21:07:08.402047 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:08.402002 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92cb18d7-af0b-494d-b5e6-05268ca7e7d6-extensions-socket-volume\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\""
Apr 16 21:07:09.137342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:09.137283 2568 scope.go:117] "RemoveContainer" containerID="621ffce197f40f392b4d4fc3e484aa927e88ee8487f6b1d6b55a0d45bd3916fb"
Apr 16 21:07:09.137342 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:09.137333 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf"
Apr 16 21:07:09.140046 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:09.140008 2568 status_manager.go:895] "Failed to get status for pod" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf" err="pods \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" is forbidden: User \"system:node:ip-10-0-138-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-120.ec2.internal' and this object"
Apr 16 21:07:09.142344 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:09.142313 2568 status_manager.go:895] "Failed to get status for pod" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hgxnf" err="pods \"kuadrant-operator-controller-manager-84b657d985-hgxnf\" is forbidden: User \"system:node:ip-10-0-138-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-120.ec2.internal' and this object"
Apr 16 21:07:27.088913 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.088874 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:07:27.089443 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.089426 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" containerName="manager"
Apr 16 21:07:27.089496 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.089448 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" containerName="manager"
Apr 16 21:07:27.089584 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.089571 2568 memory_manager.go:356] "RemoveStaleState removing state"
podUID="92cb18d7-af0b-494d-b5e6-05268ca7e7d6" containerName="manager" Apr 16 21:07:27.109999 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.109957 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:07:27.110167 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.110099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.113046 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.113010 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 21:07:27.130153 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.130114 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:07:27.158297 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.158255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gm8n\" (UniqueName: \"kubernetes.io/projected/cdf7005b-5ecd-4098-a360-468e68270d39-kube-api-access-7gm8n\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.158480 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.158343 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cdf7005b-5ecd-4098-a360-468e68270d39-config-file\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.259339 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.259303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gm8n\" 
(UniqueName: \"kubernetes.io/projected/cdf7005b-5ecd-4098-a360-468e68270d39-kube-api-access-7gm8n\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.259524 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.259369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cdf7005b-5ecd-4098-a360-468e68270d39-config-file\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.260144 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.260124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cdf7005b-5ecd-4098-a360-468e68270d39-config-file\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.268802 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.268770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gm8n\" (UniqueName: \"kubernetes.io/projected/cdf7005b-5ecd-4098-a360-468e68270d39-kube-api-access-7gm8n\") pod \"limitador-limitador-78c99df468-5jbn2\" (UID: \"cdf7005b-5ecd-4098-a360-468e68270d39\") " pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.423157 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.423075 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:27.551067 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:27.551041 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:07:27.553665 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:07:27.553635 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf7005b_5ecd_4098_a360_468e68270d39.slice/crio-7debbe19a5bba122c02b54bafed1c0a8b258fd7bb41564e8b94d814f5041249a WatchSource:0}: Error finding container 7debbe19a5bba122c02b54bafed1c0a8b258fd7bb41564e8b94d814f5041249a: Status 404 returned error can't find the container with id 7debbe19a5bba122c02b54bafed1c0a8b258fd7bb41564e8b94d814f5041249a Apr 16 21:07:28.207142 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:28.207098 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" event={"ID":"cdf7005b-5ecd-4098-a360-468e68270d39","Type":"ContainerStarted","Data":"7debbe19a5bba122c02b54bafed1c0a8b258fd7bb41564e8b94d814f5041249a"} Apr 16 21:07:30.216650 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:30.216610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" event={"ID":"cdf7005b-5ecd-4098-a360-468e68270d39","Type":"ContainerStarted","Data":"e9b715e164ea123f47b54890cb6bb1b4475493fcabff475b86f555e9016c7a4d"} Apr 16 21:07:30.217119 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:30.216729 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:07:30.235629 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:30.235572 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" podStartSLOduration=0.780133969 
podStartE2EDuration="3.235558761s" podCreationTimestamp="2026-04-16 21:07:27 +0000 UTC" firstStartedPulling="2026-04-16 21:07:27.555999264 +0000 UTC m=+561.758535979" lastFinishedPulling="2026-04-16 21:07:30.01142406 +0000 UTC m=+564.213960771" observedRunningTime="2026-04-16 21:07:30.233694613 +0000 UTC m=+564.436231345" watchObservedRunningTime="2026-04-16 21:07:30.235558761 +0000 UTC m=+564.438095496" Apr 16 21:07:41.221102 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:07:41.221074 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-5jbn2" Apr 16 21:08:00.590434 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:00.590396 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:08:06.230964 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:06.230933 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:08:06.231492 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:06.231326 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:08:06.233535 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:06.233515 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:08:06.233959 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:06.233945 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:08:42.487642 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:42.487607 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:08:46.487720 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:46.487687 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:08:55.080344 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:08:55.080310 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:09:03.375998 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:09:03.375966 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:09:35.882777 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:09:35.882741 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:09:42.176355 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:09:42.176318 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:10:25.874650 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:10:25.874577 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:10:33.574026 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:10:33.573993 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:11:05.281395 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:11:05.281339 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:11:20.672805 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:11:20.672766 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:11:59.290142 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:11:59.290054 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:12:16.286800 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:12:16.286763 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:12:30.297579 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:12:30.297543 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:12:46.680966 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:12:46.680927 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:13:06.254369 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:06.254298 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:13:06.256585 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:06.256562 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:13:06.257245 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:06.257224 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:13:06.259020 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:06.258999 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:13:39.795690 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:39.795647 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:13:47.885924 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:13:47.885881 2568 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:14:04.995000 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:14:04.994957 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:14:13.275445 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:14:13.275412 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:14:30.202823 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:14:30.202739 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:14:39.314632 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:14:39.314587 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:15:11.594048 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:15:11.594012 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:15:20.294910 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:15:20.294869 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:15:28.698841 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:15:28.698804 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:15:36.986310 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:15:36.986274 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:15:45.219252 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:15:45.219215 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:16:02.800941 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:16:02.800846 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:16:13.601975 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:16:13.601936 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:02.695640 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:02.695604 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:11.294723 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:11.294686 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:20.288678 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:20.288640 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:28.558476 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:28.558393 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:37.526394 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:37.526347 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:47.291598 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:47.291549 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:54.700855 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:54.700815 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:17:59.394357 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:17:59.394321 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:02.987102 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:02.987065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:06.278614 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:06.278585 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:18:06.281091 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:06.281058 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:18:06.281216 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:06.281158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:18:06.283559 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:06.283540 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:18:12.293167 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:12.293126 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:20.982733 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:20.982696 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:30.498362 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:30.498320 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:39.605999 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:39.605961 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:48.488813 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:48.488779 2568 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:18:56.325520 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:18:56.325428 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:19:05.106680 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:19:05.106633 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:19:13.878631 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:19:13.878592 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:19:22.617472 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:19:22.617437 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:19:32.219314 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:19:32.219278 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:20:42.219297 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:20:42.219218 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:20:46.358105 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:20:46.358069 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:20:57.379683 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:20:57.379642 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:21:27.911206 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:21:27.911168 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:10.740839 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:10.740756 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:18.997764 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:18.997730 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:28.596300 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:28.596260 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:36.624784 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:36.624749 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:44.811484 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:44.811449 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:22:53.312173 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:22:53.312138 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:02.392107 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:02.392075 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:06.302303 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:06.302271 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:23:06.304753 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:06.304734 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:23:06.305986 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:06.305963 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log" Apr 16 21:23:06.308659 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:06.308640 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log" Apr 16 21:23:07.522905 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:07.522860 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:17.248257 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:17.248220 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:25.090306 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:25.090224 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:33.610955 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:33.610914 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:41.483087 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:41.483050 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:23:59.689355 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:23:59.689316 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:08.002533 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:08.002497 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:16.781654 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:16.781610 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:24.979398 
ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:24.979350 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:42.584246 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:42.584212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:50.387908 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:50.387868 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:24:59.282615 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:24:59.282527 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:25:07.902596 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:25:07.902562 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:25:16.001985 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:25:16.001947 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:25:25.381689 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:25:25.381651 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:25:34.285255 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:25:34.285216 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:25:50.592318 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:25:50.592279 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:26:00.405429 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:26:00.405395 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"] Apr 16 21:26:16.881152 ip-10-0-138-120 
kubenswrapper[2568]: I0416 21:26:16.881112 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:26:25.684288 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:26:25.684201 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:26:33.795336 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:26:33.795293 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:26:42.282790 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:26:42.282754 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:26:47.280464 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:26:47.280422 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:27:05.375169 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:27:05.375123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:27:13.379409 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:27:13.379346 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:27:22.286413 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:27:22.286357 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:27:27.286551 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:27:27.286512 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:27:51.583286 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:27:51.583242 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:28:05.079609 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:05.079567 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5jbn2"]
Apr 16 21:28:06.329979 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:06.329943 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:28:06.333031 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:06.333003 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:28:06.333427 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:06.333403 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log"
Apr 16 21:28:06.336031 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:06.336011 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log"
Apr 16 21:28:11.208061 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:11.208028 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-lpddp_f45cf097-e1ff-4b18-b113-cab89f129ef3/manager/0.log"
Apr 16 21:28:12.972259 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:12.972226 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cpfml_fa853e1d-0a8e-46f6-a383-669539117800/kuadrant-console-plugin/0.log"
Apr 16 21:28:13.325665 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:13.325636 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5jbn2_cdf7005b-5ecd-4098-a360-468e68270d39/limitador/0.log"
Apr 16 21:28:14.027716 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:14.027685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56b49765cd-9zmvz_c308960d-54f7-4b00-b5ef-867a99c50916/kube-auth-proxy/0.log"
Apr 16 21:28:19.043587 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.043548 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hmfkt/must-gather-28nn9"]
Apr 16 21:28:19.051344 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.051305 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.051657 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.051624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxmv\" (UniqueName: \"kubernetes.io/projected/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-kube-api-access-qfxmv\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.051767 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.051712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-must-gather-output\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.054457 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.054430 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"kube-root-ca.crt\""
Apr 16 21:28:19.054564 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.054457 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"openshift-service-ca.crt\""
Apr 16 21:28:19.055671 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.055653 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hmfkt\"/\"default-dockercfg-7w55v\""
Apr 16 21:28:19.064497 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.064472 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/must-gather-28nn9"]
Apr 16 21:28:19.152346 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.152313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxmv\" (UniqueName: \"kubernetes.io/projected/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-kube-api-access-qfxmv\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.152529 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.152405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-must-gather-output\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.152676 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.152659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-must-gather-output\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.161672 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.161653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxmv\" (UniqueName: \"kubernetes.io/projected/8b417a0e-e8b0-42ab-abc9-fe0a77e2e817-kube-api-access-qfxmv\") pod \"must-gather-28nn9\" (UID: \"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817\") " pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.361399 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.361295 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/must-gather-28nn9"
Apr 16 21:28:19.487040 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.486881 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/must-gather-28nn9"]
Apr 16 21:28:19.491276 ip-10-0-138-120 kubenswrapper[2568]: W0416 21:28:19.491242 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b417a0e_e8b0_42ab_abc9_fe0a77e2e817.slice/crio-981390152aeda3050a0b804f189feff207ecd88116b7b64a1059c87c485d07a1 WatchSource:0}: Error finding container 981390152aeda3050a0b804f189feff207ecd88116b7b64a1059c87c485d07a1: Status 404 returned error can't find the container with id 981390152aeda3050a0b804f189feff207ecd88116b7b64a1059c87c485d07a1
Apr 16 21:28:19.493298 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:19.493278 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:28:20.440626 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:20.440587 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/must-gather-28nn9" event={"ID":"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817","Type":"ContainerStarted","Data":"981390152aeda3050a0b804f189feff207ecd88116b7b64a1059c87c485d07a1"}
Apr 16 21:28:21.445841 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:21.445806 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/must-gather-28nn9" event={"ID":"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817","Type":"ContainerStarted","Data":"029945aa39de74443cd7db9bfa27403b22b16cd2868ba2deb4ce4c89eb01d01d"}
Apr 16 21:28:21.446243 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:21.445848 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/must-gather-28nn9" event={"ID":"8b417a0e-e8b0-42ab-abc9-fe0a77e2e817","Type":"ContainerStarted","Data":"5031c39bf4857d30cc925daafa78150b7f18deb3b1f9f13a83fde35c889abadb"}
Apr 16 21:28:21.466761 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:21.466710 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hmfkt/must-gather-28nn9" podStartSLOduration=1.612965527 podStartE2EDuration="2.466696563s" podCreationTimestamp="2026-04-16 21:28:19 +0000 UTC" firstStartedPulling="2026-04-16 21:28:19.493462919 +0000 UTC m=+1813.695999629" lastFinishedPulling="2026-04-16 21:28:20.347193955 +0000 UTC m=+1814.549730665" observedRunningTime="2026-04-16 21:28:21.464209193 +0000 UTC m=+1815.666745950" watchObservedRunningTime="2026-04-16 21:28:21.466696563 +0000 UTC m=+1815.669233292"
Apr 16 21:28:21.970318 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:21.970255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h8fjd_3b1b29e8-6823-494b-9501-ec38717ca6cd/global-pull-secret-syncer/0.log"
Apr 16 21:28:22.131279 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:22.131242 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6dn6k_9d60d121-9e76-46da-a382-d5c74b2c3a1e/konnectivity-agent/0.log"
Apr 16 21:28:22.226704 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:22.226631 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-120.ec2.internal_4c5556f6af36052a906fa0aef20bfb6c/haproxy/0.log"
Apr 16 21:28:26.833125 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:26.833044 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cpfml_fa853e1d-0a8e-46f6-a383-669539117800/kuadrant-console-plugin/0.log"
Apr 16 21:28:27.071765 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:27.071734 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5jbn2_cdf7005b-5ecd-4098-a360-468e68270d39/limitador/0.log"
Apr 16 21:28:28.712618 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.712519 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/alertmanager/0.log"
Apr 16 21:28:28.776608 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.776484 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/config-reloader/0.log"
Apr 16 21:28:28.820763 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.820727 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/kube-rbac-proxy-web/0.log"
Apr 16 21:28:28.895168 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.895127 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/kube-rbac-proxy/0.log"
Apr 16 21:28:28.956016 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.955987 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/kube-rbac-proxy-metric/0.log"
Apr 16 21:28:28.985867 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:28.985791 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/prom-label-proxy/0.log"
Apr 16 21:28:29.016789 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.016746 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_091e6cfb-6782-44ce-8308-7094ca7107cf/init-config-reloader/0.log"
Apr 16 21:28:29.095963 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.095925 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8v8q5_c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086/kube-state-metrics/0.log"
Apr 16 21:28:29.133531 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.133498 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8v8q5_c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086/kube-rbac-proxy-main/0.log"
Apr 16 21:28:29.162159 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.162124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8v8q5_c7e0ea71-e0ca-4bd3-8a86-ebcf3bb88086/kube-rbac-proxy-self/0.log"
Apr 16 21:28:29.332699 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.332670 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ks9m9_bb636b7d-dc5b-447a-8265-30a0d805ab14/node-exporter/0.log"
Apr 16 21:28:29.365012 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.364978 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ks9m9_bb636b7d-dc5b-447a-8265-30a0d805ab14/kube-rbac-proxy/0.log"
Apr 16 21:28:29.397042 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.397002 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ks9m9_bb636b7d-dc5b-447a-8265-30a0d805ab14/init-textfile/0.log"
Apr 16 21:28:29.524355 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.524323 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7sgvx_a9e3ec16-5b10-458c-92e9-fc70564250dc/kube-rbac-proxy-main/0.log"
Apr 16 21:28:29.552418 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.552370 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7sgvx_a9e3ec16-5b10-458c-92e9-fc70564250dc/kube-rbac-proxy-self/0.log"
Apr 16 21:28:29.579484 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.579453 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7sgvx_a9e3ec16-5b10-458c-92e9-fc70564250dc/openshift-state-metrics/0.log"
Apr 16 21:28:29.618419 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.618314 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/prometheus/0.log"
Apr 16 21:28:29.644139 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.644105 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/config-reloader/0.log"
Apr 16 21:28:29.672880 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.672850 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/thanos-sidecar/0.log"
Apr 16 21:28:29.699170 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.699144 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/kube-rbac-proxy-web/0.log"
Apr 16 21:28:29.728725 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.728694 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/kube-rbac-proxy/0.log"
Apr 16 21:28:29.766318 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.766285 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/kube-rbac-proxy-thanos/0.log"
Apr 16 21:28:29.797164 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.797136 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d0f404b7-076d-4b10-a425-05da184a3e01/init-config-reloader/0.log"
Apr 16 21:28:29.895297 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.895221 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hgw5q_9629b488-a52e-4946-9c32-2acc348e6da5/prometheus-operator-admission-webhook/0.log"
Apr 16 21:28:29.930002 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.929972 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78dcd8b57c-2vd2v_08025926-ab85-485a-b626-2d52b4c806ed/telemeter-client/0.log"
Apr 16 21:28:29.960991 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.960958 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78dcd8b57c-2vd2v_08025926-ab85-485a-b626-2d52b4c806ed/reload/0.log"
Apr 16 21:28:29.987557 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:29.987522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78dcd8b57c-2vd2v_08025926-ab85-485a-b626-2d52b4c806ed/kube-rbac-proxy/0.log"
Apr 16 21:28:30.688193 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.688149 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"]
Apr 16 21:28:30.694980 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.694947 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.706080 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.706044 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"]
Apr 16 21:28:30.763638 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.763595 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlt6z\" (UniqueName: \"kubernetes.io/projected/71656d41-a5d6-4cfb-8d88-9a23fd028746-kube-api-access-rlt6z\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.764290 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.764264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-sys\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.764483 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.764466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-podres\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.764648 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.764634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-proc\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.764778 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.764763 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-lib-modules\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.866617 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.866566 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlt6z\" (UniqueName: \"kubernetes.io/projected/71656d41-a5d6-4cfb-8d88-9a23fd028746-kube-api-access-rlt6z\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867364 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-sys\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867527 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867436 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-podres\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867527 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-sys\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867527 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-proc\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867527 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-lib-modules\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867749 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-proc\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867749 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-podres\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.867749 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.867662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71656d41-a5d6-4cfb-8d88-9a23fd028746-lib-modules\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:30.877337 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:30.877306 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlt6z\" (UniqueName: \"kubernetes.io/projected/71656d41-a5d6-4cfb-8d88-9a23fd028746-kube-api-access-rlt6z\") pod \"perf-node-gather-daemonset-kr927\" (UID: \"71656d41-a5d6-4cfb-8d88-9a23fd028746\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:31.007917 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.007872 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:31.166092 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.166021 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"]
Apr 16 21:28:31.494316 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.494276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927" event={"ID":"71656d41-a5d6-4cfb-8d88-9a23fd028746","Type":"ContainerStarted","Data":"692d1abbee576e5d37693bfbda7a18651405a944b30083e7b580c34009fbac5c"}
Apr 16 21:28:31.494316 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.494319 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927" event={"ID":"71656d41-a5d6-4cfb-8d88-9a23fd028746","Type":"ContainerStarted","Data":"9007859a8bf998f26c2e496797d8ca2095094818cfa245c9c7f4df3ff3414dc7"}
Apr 16 21:28:31.494574 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.494365 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:31.516108 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.516060 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927" podStartSLOduration=1.5160411150000002 podStartE2EDuration="1.516041115s" podCreationTimestamp="2026-04-16 21:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:28:31.514860452 +0000 UTC m=+1825.717397183" watchObservedRunningTime="2026-04-16 21:28:31.516041115 +0000 UTC m=+1825.718577848"
Apr 16 21:28:31.816827 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.816798 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/1.log"
Apr 16 21:28:31.822729 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:31.822688 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7jmtg_861b7de5-b08b-459b-b432-8f80dc4d6df7/console-operator/2.log"
Apr 16 21:28:32.754550 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:32.754522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zn96z_c556d26b-cf58-4648-b502-fd757f2e826b/volume-data-source-validator/0.log"
Apr 16 21:28:33.594514 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:33.594485 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tbbn8_7d61f010-bef5-435b-a6dd-30cf6ec4dbe2/dns/0.log"
Apr 16 21:28:33.627335 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:33.627304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tbbn8_7d61f010-bef5-435b-a6dd-30cf6ec4dbe2/kube-rbac-proxy/0.log"
Apr 16 21:28:33.654512 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:33.654486 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cjqkn_16da182a-ff12-4e2d-800d-10e00ef1512d/dns-node-resolver/0.log"
Apr 16 21:28:34.184694 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:34.184656 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7fbd4b68b-464br_89b22818-d7c9-4f53-b62b-fa46caa8ef94/registry/0.log"
Apr 16 21:28:34.283781 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:34.283754 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6rx6_0a96ee09-38fe-48c6-891c-03c6540c788b/node-ca/0.log"
Apr 16 21:28:35.315502 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:35.315473 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56b49765cd-9zmvz_c308960d-54f7-4b00-b5ef-867a99c50916/kube-auth-proxy/0.log"
Apr 16 21:28:35.922146 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:35.922115 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4chc2_6d8eb548-23d0-403d-a61e-f91a50c71507/serve-healthcheck-canary/0.log"
Apr 16 21:28:36.432900 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:36.432871 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f8gqh_7467cabe-4fe5-428e-85ad-b7c1293cf891/kube-rbac-proxy/0.log"
Apr 16 21:28:36.461579 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:36.461545 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f8gqh_7467cabe-4fe5-428e-85ad-b7c1293cf891/exporter/0.log"
Apr 16 21:28:36.487231 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:36.487198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f8gqh_7467cabe-4fe5-428e-85ad-b7c1293cf891/extractor/0.log"
Apr 16 21:28:37.510473 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:37.510445 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-kr927"
Apr 16 21:28:38.830655 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:38.830620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-lpddp_f45cf097-e1ff-4b18-b113-cab89f129ef3/manager/0.log"
Apr 16 21:28:40.207589 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:40.207554 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-c5b769f8c-hm9s4_80d62e77-a324-421f-a766-de3a332f82d5/manager/0.log"
Apr 16 21:28:44.457660 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:44.457627 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lmxn9_2b8d9335-a0d6-4d52-b133-5eb8282aab9a/migrator/0.log"
Apr 16 21:28:44.482780 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:44.482745 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lmxn9_2b8d9335-a0d6-4d52-b133-5eb8282aab9a/graceful-termination/0.log"
Apr 16 21:28:46.061929 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.061894 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:28:46.088052 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.088011 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/egress-router-binary-copy/0.log"
Apr 16 21:28:46.113149 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.113119 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/cni-plugins/0.log"
Apr 16 21:28:46.136367 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.136327 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/bond-cni-plugin/0.log"
Apr 16 21:28:46.160560 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.160535 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/routeoverride-cni/0.log"
Apr 16 21:28:46.184974 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.184933 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/whereabouts-cni-bincopy/0.log"
Apr 16 21:28:46.210715 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.210684 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt6zf_53b01fc4-bf7b-492d-ac6d-538fc5854832/whereabouts-cni/0.log"
Apr 16 21:28:46.423626 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.423543 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pd7bg_790473ff-de78-426c-8164-f182aadaf583/kube-multus/0.log"
Apr 16 21:28:46.445484 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.445457 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-llm2q_a6620072-c60f-4d78-bc86-aac34b2c5098/network-metrics-daemon/0.log"
Apr 16 21:28:46.480648 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:46.480605 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-llm2q_a6620072-c60f-4d78-bc86-aac34b2c5098/kube-rbac-proxy/0.log"
Apr 16 21:28:47.411647 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.411566 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-controller/0.log"
Apr 16 21:28:47.436570 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.436541 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/0.log"
Apr 16 21:28:47.444765 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.444743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovn-acl-logging/1.log"
Apr 16 21:28:47.469190 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.469127 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/kube-rbac-proxy-node/0.log"
Apr 16 21:28:47.495109 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.495078 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:28:47.517463 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.517435 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/northd/0.log"
Apr 16 21:28:47.543232 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.543199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/nbdb/0.log"
Apr 16 21:28:47.568976 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.568920 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/sbdb/0.log"
Apr 16 21:28:47.697984 ip-10-0-138-120 kubenswrapper[2568]: I0416 21:28:47.697896 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvjzw_66b94037-d7e2-4eef-911d-5525fbe6343a/ovnkube-controller/0.log"