Apr 22 14:13:13.868854 ip-10-0-133-65 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:13:13.868864 ip-10-0-133-65 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:13:13.868873 ip-10-0-133-65 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:13:13.869168 ip-10-0-133-65 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:13:23.960665 ip-10-0-133-65 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:13:23.960683 ip-10-0-133-65 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0b7b5e09e2d245ae96db4570713cf4e0 --
Apr 22 14:15:38.091463 ip-10-0-133-65 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:38.569607 ip-10-0-133-65 kubenswrapper[2542]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:38.569607 ip-10-0-133-65 kubenswrapper[2542]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:38.569607 ip-10-0-133-65 kubenswrapper[2542]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:38.569607 ip-10-0-133-65 kubenswrapper[2542]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:38.569607 ip-10-0-133-65 kubenswrapper[2542]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:38.571941 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.571827    2542 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:38.576815 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576794    2542 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:38.576815 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576810    2542 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:38.576815 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576814    2542 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:38.576815 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576818    2542 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576824    2542 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576827    2542 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576830    2542 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576833    2542 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576836    2542 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576838    2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576841    2542 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576845    2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576850    2542 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576854    2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576857    2542 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576860    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576865    2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576869    2542 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576872    2542 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576875    2542 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576877    2542 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576880    2542 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:38.577074 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576882    2542 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576885    2542 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576888    2542 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576890    2542 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576893    2542 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576896    2542 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576899    2542 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576904    2542 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576907    2542 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576909    2542 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576912    2542 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576914    2542 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576917    2542 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576919    2542 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576921    2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576925    2542 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576928    2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576930    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576934    2542 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:38.577606 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576936    2542 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576941    2542 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576944    2542 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576947    2542 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576950    2542 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576953    2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576955    2542 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576958    2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576960    2542 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576963    2542 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576966    2542 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576968    2542 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576971    2542 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576974    2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576978    2542 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576981    2542 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576984    2542 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576987    2542 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576989    2542 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576992    2542 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:38.578161 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.576997    2542 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577000    2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577003    2542 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577005    2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577008    2542 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577014    2542 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577017    2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577019    2542 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577023    2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577027    2542 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577029    2542 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577032    2542 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577035    2542 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577037    2542 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577040    2542 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577044    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577051    2542 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577053    2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577056    2542 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577059    2542 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:38.578688 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577062    2542 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577064    2542 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577067    2542 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577069    2542 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.577071    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578025    2542 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578036    2542 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578040    2542 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578043    2542 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578046    2542 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578049    2542 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578052    2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578055    2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578058    2542 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578061    2542 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578064    2542 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578067    2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578069    2542 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578072    2542 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578075    2542 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:38.579145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578077    2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578080    2542 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578083    2542 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578085    2542 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578087    2542 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578090    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578092    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578095    2542 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578100    2542 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578103    2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578106    2542 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578108    2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578111    2542 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578114    2542 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578117    2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578119    2542 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578121    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578124    2542 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578127    2542 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578130    2542 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:38.579628 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578132    2542 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578135    2542 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578137    2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578140    2542 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578142    2542 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578145    2542 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578147    2542 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578150    2542 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578152    2542 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578154    2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578157    2542 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578160    2542 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578162    2542 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578165    2542 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578168    2542 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578170    2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578173    2542 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578175    2542 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578178    2542 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578180    2542 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:38.580149 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578196    2542 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578198    2542 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578201    2542 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578203    2542 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578206    2542 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578209    2542 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578211    2542 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578213    2542 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578216    2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578221    2542 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578224    2542 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578227    2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578229    2542 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578232    2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578235    2542 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578237    2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578240    2542 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578242    2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578245    2542 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:38.580632 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578248    2542 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578250    2542 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578253    2542 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578255    2542 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578259    2542 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578262    2542 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578264    2542 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578267    2542 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578269    2542 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578273    2542 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578277    2542 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.578279    2542 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579490    2542 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579499    2542 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579506    2542 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579511    2542 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579516    2542 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579520    2542 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579524    2542 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579528    2542 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579531    2542 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:38.581080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579534    2542 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579540    2542 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579543    2542 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579546    2542 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579549    2542 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579552    2542 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579555    2542 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579558    2542 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579561    2542 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579564    2542 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579568    2542 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579570    2542 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579574    2542 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579576    2542 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579579    2542 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579583    2542 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579586    2542 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579589    2542 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579592    2542 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579595    2542 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579598    2542 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579600    2542 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579603    2542 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579606    2542 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579611    2542 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:38.581588 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579613    2542 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579616    2542 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579619    2542 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579622    2542 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579625    2542 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579630    2542 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579633    2542 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579636    2542 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579641    2542 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579644    2542 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579648    2542 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579651    2542 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579654    2542 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579657    2542 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579660    2542 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579663    2542 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579666    2542 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579669    2542 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579672    2542 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579674    2542 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579677    2542 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579681    2542 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579683 2542 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579687 2542 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579690 2542 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579692 2542 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:15:38.582179 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579695 2542 flags.go:64] FLAG: --help="false" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579698 2542 flags.go:64] FLAG: --hostname-override="ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579702 2542 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579704 2542 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579707 2542 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579711 2542 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579714 2542 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579717 2542 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579719 2542 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:15:38.579722 2542 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579725 2542 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579728 2542 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579731 2542 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579735 2542 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579738 2542 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579742 2542 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579745 2542 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579748 2542 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579751 2542 flags.go:64] FLAG: --lock-file="" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579753 2542 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579756 2542 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579759 2542 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579770 2542 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:15:38.582794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579773 2542 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:15:38.583382 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579776 2542 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579779 2542 flags.go:64] FLAG: --logging-format="text" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579782 2542 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579785 2542 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579788 2542 flags.go:64] FLAG: --manifest-url="" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579791 2542 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579795 2542 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579798 2542 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579801 2542 flags.go:64] FLAG: --max-pods="110" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579804 2542 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579807 2542 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579810 2542 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579813 2542 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579815 2542 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579818 2542 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579821 2542 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579829 2542 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579832 2542 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579835 2542 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579838 2542 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579841 2542 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579847 2542 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579850 2542 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 14:15:38.583382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579853 2542 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579856 2542 flags.go:64] FLAG: --port="10250" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579860 2542 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579862 2542 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a5224b02736c6c44" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579866 2542 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579868 
2542 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579871 2542 flags.go:64] FLAG: --register-node="true" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579874 2542 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579877 2542 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579880 2542 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579883 2542 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579886 2542 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579889 2542 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579892 2542 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579895 2542 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579898 2542 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579901 2542 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579904 2542 flags.go:64] FLAG: --runonce="false" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579907 2542 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579909 2542 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579912 2542 
flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579915 2542 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579918 2542 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579921 2542 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579924 2542 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579927 2542 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:15:38.583974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579929 2542 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579932 2542 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579935 2542 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579939 2542 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579942 2542 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579945 2542 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579948 2542 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579953 2542 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579956 2542 flags.go:64] FLAG: --tls-cert-file="" Apr 22 
14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579959 2542 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579963 2542 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579966 2542 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579969 2542 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579972 2542 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579974 2542 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579987 2542 flags.go:64] FLAG: --v="2" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579992 2542 flags.go:64] FLAG: --version="false" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.579996 2542 flags.go:64] FLAG: --vmodule="" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.580003 2542 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.580006 2542 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580086 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580090 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580093 2542 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: W0422 
14:15:38.580096 2542 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:38.584605 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580098 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580101 2542 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580103 2542 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580106 2542 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580108 2542 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580111 2542 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580114 2542 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580116 2542 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580119 2542 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580121 2542 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580123 2542 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580126 2542 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:38.585207 
ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580129 2542 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580131 2542 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580135 2542 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580138 2542 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580141 2542 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580143 2542 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580146 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:38.585207 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580148 2542 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580151 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580153 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580156 2542 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580158 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580161 
2542 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580164 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580166 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580169 2542 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580172 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580174 2542 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580177 2542 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580179 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580181 2542 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580196 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580198 2542 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580201 2542 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580204 2542 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 
14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580208 2542 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:38.585694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580211 2542 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580214 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580217 2542 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580220 2542 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580223 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580226 2542 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580228 2542 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580234 2542 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580237 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580241 2542 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580243 2542 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580246 2542 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580248 2542 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580251 2542 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580253 2542 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580256 2542 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580258 2542 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580261 2542 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580263 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580266 2542 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:38.586145 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580268 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580270 2542 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580273 2542 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580275 2542 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 
14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580278 2542 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580280 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580283 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580285 2542 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580288 2542 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580290 2542 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580292 2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580295 2542 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580298 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580300 2542 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580303 2542 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580305 2542 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580307 2542 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 
14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580310 2542 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580312 2542 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580316 2542 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:38.586623 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580318 2542 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580322 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580325 2542 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.580327 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.580333 2542 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.586881 2542 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.586896 2542 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK="" Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586940 2542 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586945 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586949 2542 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586952 2542 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586954 2542 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586957 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586959 2542 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586962 2542 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:38.587086 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586965 2542 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586967 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586970 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586972 2542 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 
14:15:38.586975 2542 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586977 2542 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586980 2542 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586982 2542 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586985 2542 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586987 2542 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586990 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586992 2542 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586995 2542 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.586997 2542 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587000 2542 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587002 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587005 2542 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 
14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587007 2542 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587010 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587013 2542 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:38.587469 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587016 2542 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587018 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587021 2542 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587026 2542 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587030 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587033 2542 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587035 2542 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587038 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587041 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587043 2542 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587046 2542 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587048 2542 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587051 2542 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587053 2542 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587056 2542 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587058 2542 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587061 2542 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587063 2542 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587065 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:38.587943 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587068 2542 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587071 2542 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587074 2542 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587076 2542 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587079 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587081 2542 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587083 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587086 2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587088 2542 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587091 2542 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: 
W0422 14:15:38.587093 2542 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587096 2542 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587099 2542 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587101 2542 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587104 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587106 2542 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587109 2542 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587112 2542 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587114 2542 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:38.588447 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587117 2542 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587121 2542 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587124 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587127 2542 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587129 2542 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587132 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587134 2542 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587137 2542 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587139 2542 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587142 2542 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587144 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587147 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587149 2542 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587151 2542 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587154 2542 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 
14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587156 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587159 2542 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587161 2542 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587163 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:38.588878 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587166 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.587170 2542 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587277 2542 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587282 2542 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587285 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587288 2542 feature_gate.go:328] 
unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587291 2542 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587293 2542 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587296 2542 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587298 2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587300 2542 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587303 2542 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587306 2542 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587309 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587311 2542 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:38.589335 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587313 2542 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587316 2542 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587318 2542 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: 
W0422 14:15:38.587321 2542 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587323 2542 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587325 2542 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587328 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587330 2542 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587333 2542 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587335 2542 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587338 2542 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587340 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587343 2542 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587348 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587350 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587353 2542 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587355 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587357 2542 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587360 2542 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587363 2542 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:38.589698 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587366 2542 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587368 2542 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587371 2542 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587373 2542 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587376 2542 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587378 2542 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587380 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587383 2542 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587385 2542 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587388 2542 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587391 2542 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587393 2542 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587395 2542 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587398 2542 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587400 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587402 2542 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587405 2542 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587407 2542 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587409 2542 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587412 2542 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:38.590167 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587416 2542 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587419 2542 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587421 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587424 2542 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587427 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587430 2542 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587432 2542 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587435 2542 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587437 2542 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587440 2542 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587442 2542 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587445 2542 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587447 2542 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587450 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587452 2542 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587455 2542 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587457 2542 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587460 2542 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587462 2542 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587464 2542 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:38.590674 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587467 2542 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587469 2542 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587472 2542 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587475 
2542 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587477 2542 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587480 2542 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587483 2542 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587485 2542 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587488 2542 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587490 2542 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587492 2542 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587495 2542 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:38.587497 2542 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.587502 2542 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:38.591134 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.588152 2542 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 14:15:38.591944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.591930 2542 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 14:15:38.592941 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.592931 2542 server.go:1019] "Starting client certificate rotation" Apr 22 14:15:38.593041 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.593026 2542 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:38.593075 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.593067 2542 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:38.620171 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.620154 2542 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:38.622927 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.622909 2542 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:38.634516 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.634502 2542 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:15:38.640470 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.640454 2542 log.go:25] "Validated CRI v1 image API" Apr 22 14:15:38.644784 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.644769 2542 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:15:38.648924 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.648869 2542 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 
8e86a957-b015-4005-9b10-b0500d8279b4:/dev/nvme0n1p4 afa21467-0c98-4756-a3ad-c77cf5f7c5aa:/dev/nvme0n1p3] Apr 22 14:15:38.649008 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.648919 2542 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 14:15:38.651596 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.651573 2542 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:38.654659 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.654555 2542 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:38.65352325 +0000 UTC m=+0.435900373 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499998 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2aa9a12a2b99e34307f494c1933eb8 SystemUUID:ec2aa9a1-2a2b-99e3-4307-f494c1933eb8 BootID:0b7b5e09-e2d2-45ae-96db-4570713cf4e0 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} 
{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:eb:fc:8f:2e:9d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:fc:8f:2e:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:4c:ba:12:e9:d8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 14:15:38.654659 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.654651 2542 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 14:15:38.654762 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.654736 2542 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:38.657324 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657303 2542 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:38.657453 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657326 2542 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-65.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:38.657499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657462 2542 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:38.657499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657470 2542 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:38.657499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657482 2542 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:38.657499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.657494 2542 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:38.658919 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.658909 2542 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:38.659054 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.659045 2542 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:38.661729 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.661720 2542 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:38.661760 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.661731 2542 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:38.661760 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.661742 2542 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:38.661760 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.661751 2542 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:38.661760 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.661758 2542 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 14:15:38.662812 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.662801 2542 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:38.662852 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.662818 2542 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:38.665534 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.665516 2542 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:38.667310 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.667296 2542 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:38.668625 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668613 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668631 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668640 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668648 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668656 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668664 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668671 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668680 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668690 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:38.668706 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668699 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:38.668980 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668711 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:38.668980 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.668835 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:38.669958 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.669948 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:38.670013 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.669961 2542 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:38.673237 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.673215 2542 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:38.673336 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.673262 2542 server.go:1295] "Started kubelet" Apr 22 14:15:38.673382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.673329 2542 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:38.673447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.673406 2542 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:38.673484 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.673465 2542 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:38.673971 ip-10-0-133-65 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 14:15:38.674629 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.674559 2542 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-65.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:38.674727 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.674714 2542 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-65.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:38.674789 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.674767 2542 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:38.675278 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.675262 2542 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:38.675772 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.675761 2542 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:38.679489 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.678500 2542 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-65.ec2.internal.18a8b36e269ca6e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-65.ec2.internal,UID:ip-10-0-133-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-65.ec2.internal,},FirstTimestamp:2026-04-22 14:15:38.673235687 +0000 UTC m=+0.455612817,LastTimestamp:2026-04-22 14:15:38.673235687 +0000 UTC m=+0.455612817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-65.ec2.internal,}" Apr 22 14:15:38.680974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.680956 2542 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:38.681500 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.681483 2542 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 14:15:38.682217 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.682200 2542 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 14:15:38.682217 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.682219 2542 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 14:15:38.682343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.682295 2542 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 14:15:38.682391 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.682348 2542 reconstruct.go:97] "Volume reconstruction finished" Apr 22 14:15:38.682391 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.682355 2542 reconciler.go:26] "Reconciler: start to sync state" Apr 22 14:15:38.682479 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.682417 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:38.685038 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685021 2542 factory.go:55] Registering systemd factory 
Apr 22 14:15:38.685038 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685041 2542 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:15:38.685299 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685287 2542 factory.go:153] Registering CRI-O factory
Apr 22 14:15:38.685339 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685303 2542 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:15:38.685385 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685361 2542 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:15:38.685416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685388 2542 factory.go:103] Registering Raw factory
Apr 22 14:15:38.685416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.685402 2542 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:15:38.687077 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.687059 2542 manager.go:319] Starting recovery of all containers
Apr 22 14:15:38.687170 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.687085 2542 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:15:38.688101 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.688067 2542 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 14:15:38.688215 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.688110 2542 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-65.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 14:15:38.692563 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.692541 2542 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m8mg4"
Apr 22 14:15:38.696694 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.696680 2542 manager.go:324] Recovery completed
Apr 22 14:15:38.699525 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.699508 2542 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m8mg4"
Apr 22 14:15:38.700788 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.700772 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:38.703171 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703156 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:38.703248 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703181 2542 kubelet_node_status.go:736] "Recording event
message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.703248 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703206 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.703693 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703675 2542 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:38.703693 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703691 2542 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:38.703796 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.703709 2542 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:38.705986 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.705973 2542 policy_none.go:49] "None policy: Start" Apr 22 14:15:38.706025 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.705990 2542 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:38.706025 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.706000 2542 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:38.708216 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.708042 2542 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-65.ec2.internal.18a8b36e28656957 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-65.ec2.internal,UID:ip-10-0-133-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-65.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-65.ec2.internal,},FirstTimestamp:2026-04-22 14:15:38.703169879 +0000 UTC m=+0.485547002,LastTimestamp:2026-04-22 
14:15:38.703169879 +0000 UTC m=+0.485547002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-65.ec2.internal,}" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739360 2542 manager.go:341] "Starting Device Plugin manager" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.739384 2542 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739393 2542 server.go:85] "Starting device plugin registration server" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739607 2542 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739620 2542 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739709 2542 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739782 2542 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.739791 2542 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.740266 2542 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 14:15:38.746554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.740296 2542 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:38.781503 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.781478 2542 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 14:15:38.782769 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.782754 2542 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 14:15:38.782836 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.782778 2542 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 14:15:38.782836 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.782793 2542 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 14:15:38.782836 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.782799 2542 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 14:15:38.782836 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.782833 2542 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 14:15:38.784694 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.784674 2542 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:38.840333 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.840281 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:38.841017 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.841001 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:38.841089 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.841029 2542 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.841089 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.841040 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.841089 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.841066 2542 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.854891 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.854875 2542 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.854936 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.854894 2542 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-65.ec2.internal\": node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:38.871534 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.871511 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:38.883691 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.883660 2542 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal"] Apr 22 14:15:38.883748 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.883740 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:38.885109 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.885096 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:38.885196 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.885124 2542 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.885196 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.885135 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.886320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886308 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:38.886443 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886431 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.886477 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886458 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:38.886867 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886856 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:38.886924 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886878 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.886924 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886857 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:38.886924 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886890 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.886924 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886903 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.886924 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.886915 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.887860 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.887848 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.887906 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.887870 2542 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:38.888506 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.888492 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:38.888583 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.888515 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:38.888583 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.888528 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:38.917753 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.917734 2542 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-65.ec2.internal\" not found" node="ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.922023 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.922009 2542 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-65.ec2.internal\" not found" node="ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.972545 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:38.972527 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 
22 14:15:38.982999 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.982977 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.983080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.983009 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:38.983080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:38.983032 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0c43fa24b7e06ba8965ac528ac76a464-config\") pod \"kube-apiserver-proxy-ip-10-0-133-65.ec2.internal\" (UID: \"0c43fa24b7e06ba8965ac528ac76a464\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.073159 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.073139 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:39.083433 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083413 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.083498 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083446 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.083498 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083465 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.083498 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083473 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0c43fa24b7e06ba8965ac528ac76a464-config\") pod \"kube-apiserver-proxy-ip-10-0-133-65.ec2.internal\" (UID: \"0c43fa24b7e06ba8965ac528ac76a464\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.083598 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083502 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f18f1cd7406c822d6c890121cdd6a6f4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal\" (UID: \"f18f1cd7406c822d6c890121cdd6a6f4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.083598 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.083502 2542 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0c43fa24b7e06ba8965ac528ac76a464-config\") pod \"kube-apiserver-proxy-ip-10-0-133-65.ec2.internal\" (UID: \"0c43fa24b7e06ba8965ac528ac76a464\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.173800 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.173781 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:39.219228 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.219207 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.224683 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.224669 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.274400 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.274380 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:39.374888 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.374863 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:39.475464 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.475399 2542 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-65.ec2.internal\" not found" Apr 22 14:15:39.507890 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.507872 2542 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:39.582364 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.582343 2542 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.592858 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.592842 2542 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:39.593011 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.592994 2542 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:39.593077 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.593003 2542 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://acdb166417a3143c6baed2f381c11d1a-49409844364ddaea.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.133.65:47976->54.84.32.154:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.593077 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.593028 2542 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" Apr 22 14:15:39.611340 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.611321 2542 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:39.662262 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.662242 2542 apiserver.go:52] "Watching apiserver" Apr 22 14:15:39.665853 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.665829 2542 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:39.668020 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.668003 2542 
reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:39.669631 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.669612 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g66xm","kube-system/konnectivity-agent-h4hlm","kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal","openshift-multus/multus-additional-cni-plugins-7x75g","openshift-network-diagnostics/network-check-target-b27s2","openshift-network-operator/iptables-alerter-hgz5l","openshift-ovn-kubernetes/ovnkube-node-q7q7t","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4","openshift-cluster-node-tuning-operator/tuned-tg5pr","openshift-dns/node-resolver-k4tqk","openshift-image-registry/node-ca-wkvtd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal","openshift-multus/multus-r7sbq"] Apr 22 14:15:39.671136 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.671123 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.673670 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.673650 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.674355 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674339 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:39.674447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674392 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:39.674447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674397 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:39.674447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674437 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:39.674586 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674469 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-llt9k\"" Apr 22 14:15:39.674586 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674545 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:39.674690 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674677 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:39.674844 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.674826 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hgz5l" Apr 22 14:15:39.675639 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.675572 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:39.675929 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.675915 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:39.676036 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.675915 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5sv9s\"" Apr 22 14:15:39.676109 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.676095 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:39.676175 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.676150 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:39.676593 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.676574 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.676678 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.676663 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:39.677051 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.677036 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:39.677141 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.677073 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:39.677351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.677338 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:39.677630 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.677615 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7qfm6\"" Apr 22 14:15:39.678082 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.678065 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:39.678568 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.678547 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2dvwk\"" Apr 22 14:15:39.678679 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.678662 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:39.679046 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.678810 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:39.679046 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.678905 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:39.680645 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.680624 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:39.680736 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.680653 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8vqk8\"" Apr 22 14:15:39.680789 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.680736 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:39.681363 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.681346 2542 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:39.681454 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.681370 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:39.681454 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.681434 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:39.681581 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.681435 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:39.681581 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.681500 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:39.683127 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.683113 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.684473 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.684458 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.685966 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.685947 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2096774e-6172-439a-a5c0-779a91d43a80-konnectivity-ca\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:39.686055 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.685977 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686055 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686027 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-log-socket\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686055 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686052 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-env-overrides\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686074 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/73c63add-22e8-4809-b696-9279d2454538-ovn-node-metrics-cert\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686100 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvvf\" (UniqueName: \"kubernetes.io/projected/6f09af95-f295-4dec-8131-f3dad5bd3e4d-kube-api-access-jcvvf\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686124 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686146 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-script-lib\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686171 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 
14:15:39.686219 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686214 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686246 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686238 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6xx\" (UniqueName: \"kubernetes.io/projected/d326b6a1-5cbd-47fa-a676-90af9406d2a9-kube-api-access-md6xx\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686277 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-iptables-alerter-script\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686302 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686340 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686366 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-systemd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686389 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-var-lib-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686413 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-etc-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686437 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-node-log\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686491 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686487 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb99l\" (UniqueName: \"kubernetes.io/projected/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-kube-api-access-fb99l\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686516 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-socket-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686539 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686581 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-netns\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686907 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686605 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-ovn\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686625 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-netd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686664 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-host\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686689 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2096774e-6172-439a-a5c0-779a91d43a80-agent-certs\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686710 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.686907 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.686735 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-registration-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.687211 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687147 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-kubelet\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687211 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687166 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687211 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687198 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-bin\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687355 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687262 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687355 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687295 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cnibin\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687351 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-device-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687385 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-sys-fs\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687390 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687411 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdxk\" (UniqueName: \"kubernetes.io/projected/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kube-api-access-jcdxk\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687440 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-systemd-units\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687463 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-os-release\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687485 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-serviceca\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687506 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfvw\" 
(UniqueName: \"kubernetes.io/projected/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-kube-api-access-qxfvw\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687529 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.687553 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687550 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-slash\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687574 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-config\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687597 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdf4\" (UniqueName: \"kubernetes.io/projected/73c63add-22e8-4809-b696-9279d2454538-kube-api-access-krdf4\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.687888 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:15:39.687622 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2"
Apr 22 14:15:39.687888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.687644 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-host-slash\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.690309 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.690251 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 14:15:39.690309 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.690294 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:39.690478 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.690412 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 14:15:39.690478 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.690444 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.690831 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9qgh5\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691077 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691126 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691165 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-b4c54\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691360 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691432 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691554 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-szcqw\""
Apr 22 14:15:39.692618 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.691656 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sxftm\""
Apr 22 14:15:39.701823 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.701784 2542 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:38 +0000 UTC" deadline="2027-12-18 16:28:08.765811713 +0000 UTC"
Apr 22 14:15:39.701823 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.701821 2542 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14522h12m29.063993593s"
Apr 22 14:15:39.704652 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.704632 2542 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:39.728127 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.728001 2542 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-44h2s"
Apr 22 14:15:39.736812 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.736794 2542 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-44h2s"
Apr 22 14:15:39.766475 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:39.766451 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c43fa24b7e06ba8965ac528ac76a464.slice/crio-5f51a6a0a7fdeed4589818eb0de4893ed6a2d758e52d92bb8e2c0e5c9b90b6c5 WatchSource:0}: Error finding container 5f51a6a0a7fdeed4589818eb0de4893ed6a2d758e52d92bb8e2c0e5c9b90b6c5: Status 404 returned error can't find the container with id 5f51a6a0a7fdeed4589818eb0de4893ed6a2d758e52d92bb8e2c0e5c9b90b6c5
Apr 22 14:15:39.767264 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:39.767244 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf18f1cd7406c822d6c890121cdd6a6f4.slice/crio-9e7d17c461cd48dff39d1d4772e760cb6e20378a4762bb0b707c6955353765b2 WatchSource:0}: Error finding container 9e7d17c461cd48dff39d1d4772e760cb6e20378a4762bb0b707c6955353765b2: Status 404 returned error can't find the container with id 9e7d17c461cd48dff39d1d4772e760cb6e20378a4762bb0b707c6955353765b2
Apr 22 14:15:39.770865 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.770851 2542 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:39.783252 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.783232 2542 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:39.785140 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.785068 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" event={"ID":"0c43fa24b7e06ba8965ac528ac76a464","Type":"ContainerStarted","Data":"5f51a6a0a7fdeed4589818eb0de4893ed6a2d758e52d92bb8e2c0e5c9b90b6c5"}
Apr 22 14:15:39.786091 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.786066 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" event={"ID":"f18f1cd7406c822d6c890121cdd6a6f4","Type":"ContainerStarted","Data":"9e7d17c461cd48dff39d1d4772e760cb6e20378a4762bb0b707c6955353765b2"}
Apr 22 14:15:39.787842 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787822 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-host-slash\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.787910 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787859 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3188-bb23-4be3-b39b-234bee924217-tmp-dir\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk"
Apr 22 14:15:39.787910 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787875 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-cnibin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.787910 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787890 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787910 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73c63add-22e8-4809-b696-9279d2454538-ovn-node-metrics-cert\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787914 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-host-slash\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787933 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.787995 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-sys\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.788024 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:39.788052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788035 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.788071 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:40.288050677 +0000 UTC m=+2.070427789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788089 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-socket-dir-parent\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788107 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-script-lib\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788122 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788145 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788174 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-iptables-alerter-script\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788210 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788241 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-systemd\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788269 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-modprobe-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788295 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788324 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-os-release\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.788343 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788336 2542 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 14:15:39.788831 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788353 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-hostroot\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.788831 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.788622 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-etc-selinux\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.789044 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789024 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-iptables-alerter-script\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.789101 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789070 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-script-lib\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.789101 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789078 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.789320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789275 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.789560 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789086 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-conf-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.789612 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789594 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-multus-daemon-config\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.789659 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789620 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-kubernetes\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.789659 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789654 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-systemd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.789740 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789684 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-etc-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.789740 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789713 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb99l\" (UniqueName: \"kubernetes.io/projected/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-kube-api-access-fb99l\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l"
Apr 22 14:15:39.789825 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789743 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-bin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.789825 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789767 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-device-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.789825 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789797 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.789944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789826 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-ovn\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.789944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789868 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-host\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd"
Apr 22 14:15:39.789944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789885 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-ovn\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.789944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789908 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2096774e-6172-439a-a5c0-779a91d43a80-agent-certs\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm"
Apr 22 14:15:39.789944 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789934 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-registration-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789966 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-kubelet\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.789997 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790027 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-bin\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790058 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790084 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cnibin\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.790145 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790123 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdxk\" (UniqueName: \"kubernetes.io/projected/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kube-api-access-jcdxk\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.790414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790158 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-system-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.790414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790208 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfvw\" (UniqueName: \"kubernetes.io/projected/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-kube-api-access-qxfvw\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd"
Apr 22 14:15:39.790414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790226 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-device-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.790414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790241 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-multus\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.790414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790280 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pztq\" (UniqueName: \"kubernetes.io/projected/8db7d438-84db-45bb-919c-709bca043fd8-kube-api-access-4pztq\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.790619 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790576 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-run-systemd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.790665 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790619 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-host\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790726 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790771 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-etc-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790801 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-registration-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790809 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-kubelet\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790850 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-bin\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790881 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790888 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-config\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790918 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-ovn-kubernetes\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.790924 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krdf4\" (UniqueName: \"kubernetes.io/projected/73c63add-22e8-4809-b696-9279d2454538-kube-api-access-krdf4\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791316 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cnibin\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.791410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791363 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2096774e-6172-439a-a5c0-779a91d43a80-konnectivity-ca\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm"
Apr 22 14:15:39.791888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791479 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.791888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791755 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-netns\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.791888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791795 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-kubelet\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq"
Apr 22 14:15:39.791888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791830 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-log-socket\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791879 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-env-overrides\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791913 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2096774e-6172-439a-a5c0-779a91d43a80-konnectivity-ca\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791936 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvvf\" (UniqueName: \"kubernetes.io/projected/6f09af95-f295-4dec-8131-f3dad5bd3e4d-kube-api-access-jcvvf\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.791969 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-log-socket\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792000 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md6xx\" (UniqueName: \"kubernetes.io/projected/d326b6a1-5cbd-47fa-a676-90af9406d2a9-kube-api-access-md6xx\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792036 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysconfig\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.792079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792063 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zmc8\" (UniqueName: \"kubernetes.io/projected/cae1d379-ac2f-4586-9634-429e6dfce7be-kube-api-access-8zmc8\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr"
Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792096 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d9a3188-bb23-4be3-b39b-234bee924217-hosts-file\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk"
Apr 22 14:15:39.792416 ip-10-0-133-65
kubenswrapper[2542]: I0422 14:15:39.792150 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792211 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-var-lib-kubelet\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792245 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-tuned\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792288 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-var-lib-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792320 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-node-log\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792350 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-socket-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.792416 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792395 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-run\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792426 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-netns\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792453 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-netd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792485 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: 
\"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792516 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-lib-modules\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792598 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzgs\" (UniqueName: \"kubernetes.io/projected/6d9a3188-bb23-4be3-b39b-234bee924217-kube-api-access-gvzgs\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792641 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-env-overrides\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792632 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-sys-fs\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792686 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-tmp\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792712 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-cni-binary-copy\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792713 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-sys-fs\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792751 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-var-lib-openvswitch\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.792765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792755 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-k8s-cni-cncf-io\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792802 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-multus-certs\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792843 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-systemd-units\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792922 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-node-log\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792944 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-os-release\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792968 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.792974 2542 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-systemd-units\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793040 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-cni-netd\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793094 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-serviceca\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793159 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-os-release\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793164 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3614a5a-0db9-44d9-bdc3-8d3344b36689-socket-dir\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793200 2542 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-conf\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.793275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793238 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-host\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793283 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-etc-kubernetes\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793293 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-run-netns\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793320 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:15:39.793382 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-slash\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793390 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f09af95-f295-4dec-8131-f3dad5bd3e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793430 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793564 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73c63add-22e8-4809-b696-9279d2454538-ovnkube-config\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793571 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-serviceca\") pod \"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.793779 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:15:39.793679 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73c63add-22e8-4809-b696-9279d2454538-host-slash\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.793779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793738 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f09af95-f295-4dec-8131-f3dad5bd3e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.794074 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.793715 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73c63add-22e8-4809-b696-9279d2454538-ovn-node-metrics-cert\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.794147 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.794112 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2096774e-6172-439a-a5c0-779a91d43a80-agent-certs\") pod \"konnectivity-agent-h4hlm\" (UID: \"2096774e-6172-439a-a5c0-779a91d43a80\") " pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:39.800494 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.800477 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:39.800494 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.800492 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:39.800628 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.800502 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:39.800628 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:39.800541 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:40.300530091 +0000 UTC m=+2.082907202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:39.801275 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.801258 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb99l\" (UniqueName: \"kubernetes.io/projected/b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c-kube-api-access-fb99l\") pod \"iptables-alerter-hgz5l\" (UID: \"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c\") " pod="openshift-network-operator/iptables-alerter-hgz5l" Apr 22 14:15:39.802772 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.802750 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfvw\" (UniqueName: \"kubernetes.io/projected/3dd1c758-7fe6-4a4e-b170-8e5c199c937c-kube-api-access-qxfvw\") pod 
\"node-ca-wkvtd\" (UID: \"3dd1c758-7fe6-4a4e-b170-8e5c199c937c\") " pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:39.803084 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.803062 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdxk\" (UniqueName: \"kubernetes.io/projected/e3614a5a-0db9-44d9-bdc3-8d3344b36689-kube-api-access-jcdxk\") pod \"aws-ebs-csi-driver-node-lkbg4\" (UID: \"e3614a5a-0db9-44d9-bdc3-8d3344b36689\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:39.803351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.803334 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdf4\" (UniqueName: \"kubernetes.io/projected/73c63add-22e8-4809-b696-9279d2454538-kube-api-access-krdf4\") pod \"ovnkube-node-q7q7t\" (UID: \"73c63add-22e8-4809-b696-9279d2454538\") " pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:39.803413 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.803379 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvvf\" (UniqueName: \"kubernetes.io/projected/6f09af95-f295-4dec-8131-f3dad5bd3e4d-kube-api-access-jcvvf\") pod \"multus-additional-cni-plugins-7x75g\" (UID: \"6f09af95-f295-4dec-8131-f3dad5bd3e4d\") " pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:39.803803 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.803785 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6xx\" (UniqueName: \"kubernetes.io/projected/d326b6a1-5cbd-47fa-a676-90af9406d2a9-kube-api-access-md6xx\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:39.893947 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.893928 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-system-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.893950 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-multus\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.893967 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pztq\" (UniqueName: \"kubernetes.io/projected/8db7d438-84db-45bb-919c-709bca043fd8-kube-api-access-4pztq\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.893982 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.893995 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-netns\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894008 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-kubelet\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894018 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-multus\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894026 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysconfig\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894049 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zmc8\" (UniqueName: \"kubernetes.io/projected/cae1d379-ac2f-4586-9634-429e6dfce7be-kube-api-access-8zmc8\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894060 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-netns\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894072 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-kubelet\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894073 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d9a3188-bb23-4be3-b39b-234bee924217-hosts-file\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894094 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-system-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894115 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d9a3188-bb23-4be3-b39b-234bee924217-hosts-file\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894117 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-var-lib-kubelet\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894128 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-cni-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894114 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysconfig\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894155 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-var-lib-kubelet\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894163 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-tuned\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894209 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-run\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894228 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-lib-modules\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894242 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzgs\" (UniqueName: \"kubernetes.io/projected/6d9a3188-bb23-4be3-b39b-234bee924217-kube-api-access-gvzgs\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894257 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-tmp\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.894306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894272 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-cni-binary-copy\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894351 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-lib-modules\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894392 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-k8s-cni-cncf-io\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894396 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-run\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894422 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-multus-certs\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894427 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-k8s-cni-cncf-io\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894471 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-conf\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894533 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-run-multus-certs\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894537 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-host\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894571 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-host\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894580 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-etc-kubernetes\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894571 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-conf\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894630 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-etc-kubernetes\") pod 
\"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894628 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3188-bb23-4be3-b39b-234bee924217-tmp-dir\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894659 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-cnibin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894694 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-sys\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894698 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-cnibin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894712 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-cni-binary-copy\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895022 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:15:39.894742 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-socket-dir-parent\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894751 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-sys\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894782 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-systemd\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894795 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-socket-dir-parent\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894806 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-modprobe-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:15:39.894828 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894832 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-systemd\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894842 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-os-release\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894861 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-hostroot\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894867 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3188-bb23-4be3-b39b-234bee924217-tmp-dir\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894883 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-conf-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894914 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-os-release\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894924 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-modprobe-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894915 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-multus-conf-dir\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894941 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-hostroot\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894938 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-multus-daemon-config\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894968 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-kubernetes\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894985 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-bin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.895641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.894924 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-sysctl-d\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.896158 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.895041 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-kubernetes\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.896158 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.895062 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8db7d438-84db-45bb-919c-709bca043fd8-host-var-lib-cni-bin\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.896158 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.895351 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8db7d438-84db-45bb-919c-709bca043fd8-multus-daemon-config\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.896331 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.896317 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-tmp\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.896365 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.896350 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cae1d379-ac2f-4586-9634-429e6dfce7be-etc-tuned\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.901562 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.901541 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zmc8\" (UniqueName: \"kubernetes.io/projected/cae1d379-ac2f-4586-9634-429e6dfce7be-kube-api-access-8zmc8\") pod \"tuned-tg5pr\" (UID: \"cae1d379-ac2f-4586-9634-429e6dfce7be\") " pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:39.901668 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.901652 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzgs\" (UniqueName: 
\"kubernetes.io/projected/6d9a3188-bb23-4be3-b39b-234bee924217-kube-api-access-gvzgs\") pod \"node-resolver-k4tqk\" (UID: \"6d9a3188-bb23-4be3-b39b-234bee924217\") " pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:39.902113 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.902095 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pztq\" (UniqueName: \"kubernetes.io/projected/8db7d438-84db-45bb-919c-709bca043fd8-kube-api-access-4pztq\") pod \"multus-r7sbq\" (UID: \"8db7d438-84db-45bb-919c-709bca043fd8\") " pod="openshift-multus/multus-r7sbq" Apr 22 14:15:39.998287 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:39.998236 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:15:40.004041 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.004018 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c63add_22e8_4809_b696_9279d2454538.slice/crio-93338f73685c8be0baadc476b3f4a161b3abe76d0bed83ecc50c7f7b11ca1ea5 WatchSource:0}: Error finding container 93338f73685c8be0baadc476b3f4a161b3abe76d0bed83ecc50c7f7b11ca1ea5: Status 404 returned error can't find the container with id 93338f73685c8be0baadc476b3f4a161b3abe76d0bed83ecc50c7f7b11ca1ea5 Apr 22 14:15:40.016869 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.016851 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7x75g" Apr 22 14:15:40.022694 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.022670 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f09af95_f295_4dec_8131_f3dad5bd3e4d.slice/crio-fb9309bb6e4549bf7929b4ff01430ab03a5eba445f7bfdbf545e9adcdbe27ce2 WatchSource:0}: Error finding container fb9309bb6e4549bf7929b4ff01430ab03a5eba445f7bfdbf545e9adcdbe27ce2: Status 404 returned error can't find the container with id fb9309bb6e4549bf7929b4ff01430ab03a5eba445f7bfdbf545e9adcdbe27ce2 Apr 22 14:15:40.027689 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.027672 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hgz5l" Apr 22 14:15:40.032893 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.032875 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f1e4c3_79b8_4c6b_a1f0_d69c6984614c.slice/crio-268f7c7649446f99832e4a47f006a2edb61e17d8b07d5b2e65f53373f6ff9c71 WatchSource:0}: Error finding container 268f7c7649446f99832e4a47f006a2edb61e17d8b07d5b2e65f53373f6ff9c71: Status 404 returned error can't find the container with id 268f7c7649446f99832e4a47f006a2edb61e17d8b07d5b2e65f53373f6ff9c71 Apr 22 14:15:40.042981 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.042966 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wkvtd" Apr 22 14:15:40.049405 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.049381 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd1c758_7fe6_4a4e_b170_8e5c199c937c.slice/crio-c2efc98767078d8540331caf61d8789feb7ee02ed87d92bd0ca513d969cab451 WatchSource:0}: Error finding container c2efc98767078d8540331caf61d8789feb7ee02ed87d92bd0ca513d969cab451: Status 404 returned error can't find the container with id c2efc98767078d8540331caf61d8789feb7ee02ed87d92bd0ca513d969cab451 Apr 22 14:15:40.060228 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.060213 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:40.065740 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.065720 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2096774e_6172_439a_a5c0_779a91d43a80.slice/crio-9d867bff854ba35bdbcac1ea349c033791767b5809e72ed178af091324528347 WatchSource:0}: Error finding container 9d867bff854ba35bdbcac1ea349c033791767b5809e72ed178af091324528347: Status 404 returned error can't find the container with id 9d867bff854ba35bdbcac1ea349c033791767b5809e72ed178af091324528347 Apr 22 14:15:40.068320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.068304 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" Apr 22 14:15:40.073067 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.073050 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3614a5a_0db9_44d9_bdc3_8d3344b36689.slice/crio-1010dacb5f9f443a8c34492b267844378af61a4c23b2612923787f482652dc00 WatchSource:0}: Error finding container 1010dacb5f9f443a8c34492b267844378af61a4c23b2612923787f482652dc00: Status 404 returned error can't find the container with id 1010dacb5f9f443a8c34492b267844378af61a4c23b2612923787f482652dc00 Apr 22 14:15:40.075155 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.075144 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" Apr 22 14:15:40.080690 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.080672 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k4tqk" Apr 22 14:15:40.081216 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.081176 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae1d379_ac2f_4586_9634_429e6dfce7be.slice/crio-a16880f75a3bbda7a520b0701955ad99dc8e2cf4c249959cd00df50aad8d9aa3 WatchSource:0}: Error finding container a16880f75a3bbda7a520b0701955ad99dc8e2cf4c249959cd00df50aad8d9aa3: Status 404 returned error can't find the container with id a16880f75a3bbda7a520b0701955ad99dc8e2cf4c249959cd00df50aad8d9aa3 Apr 22 14:15:40.085414 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.085396 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r7sbq" Apr 22 14:15:40.087232 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.087210 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d9a3188_bb23_4be3_b39b_234bee924217.slice/crio-9ddb8c2d15d5f0472abbfafbcdd8c95ab1f08b69c2e5a0fbe5fcaf389e91a6d6 WatchSource:0}: Error finding container 9ddb8c2d15d5f0472abbfafbcdd8c95ab1f08b69c2e5a0fbe5fcaf389e91a6d6: Status 404 returned error can't find the container with id 9ddb8c2d15d5f0472abbfafbcdd8c95ab1f08b69c2e5a0fbe5fcaf389e91a6d6 Apr 22 14:15:40.091542 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:15:40.091521 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db7d438_84db_45bb_919c_709bca043fd8.slice/crio-c5ba257236fec8ebd00d511b4c8d56870819c96b909ac5624fe29f017d6e7ca9 WatchSource:0}: Error finding container c5ba257236fec8ebd00d511b4c8d56870819c96b909ac5624fe29f017d6e7ca9: Status 404 returned error can't find the container with id c5ba257236fec8ebd00d511b4c8d56870819c96b909ac5624fe29f017d6e7ca9 Apr 22 14:15:40.171436 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.171280 2542 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:40.297501 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.297415 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:40.297645 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.297552 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 22 14:15:40.297645 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.297609 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:41.297590841 +0000 UTC m=+3.079967952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:40.398575 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.398447 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:40.398757 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.398630 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:40.398757 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.398650 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:40.398757 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.398663 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:40.398757 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.398715 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:41.398697428 +0000 UTC m=+3.181074544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:40.737475 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.737427 2542 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:39 +0000 UTC" deadline="2027-10-15 21:11:16.167412179 +0000 UTC" Apr 22 14:15:40.737475 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.737467 2542 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12990h55m35.429949418s" Apr 22 14:15:40.786353 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.786325 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:40.786516 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:40.786462 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:40.825176 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.823016 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" event={"ID":"cae1d379-ac2f-4586-9634-429e6dfce7be","Type":"ContainerStarted","Data":"a16880f75a3bbda7a520b0701955ad99dc8e2cf4c249959cd00df50aad8d9aa3"} Apr 22 14:15:40.829499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.829469 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" event={"ID":"e3614a5a-0db9-44d9-bdc3-8d3344b36689","Type":"ContainerStarted","Data":"1010dacb5f9f443a8c34492b267844378af61a4c23b2612923787f482652dc00"} Apr 22 14:15:40.834570 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.834500 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h4hlm" event={"ID":"2096774e-6172-439a-a5c0-779a91d43a80","Type":"ContainerStarted","Data":"9d867bff854ba35bdbcac1ea349c033791767b5809e72ed178af091324528347"} Apr 22 14:15:40.847298 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.847260 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hgz5l" event={"ID":"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c","Type":"ContainerStarted","Data":"268f7c7649446f99832e4a47f006a2edb61e17d8b07d5b2e65f53373f6ff9c71"} Apr 22 14:15:40.853744 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.853719 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerStarted","Data":"fb9309bb6e4549bf7929b4ff01430ab03a5eba445f7bfdbf545e9adcdbe27ce2"} Apr 22 14:15:40.884230 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.884198 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r7sbq" 
event={"ID":"8db7d438-84db-45bb-919c-709bca043fd8","Type":"ContainerStarted","Data":"c5ba257236fec8ebd00d511b4c8d56870819c96b909ac5624fe29f017d6e7ca9"} Apr 22 14:15:40.901147 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.901064 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkvtd" event={"ID":"3dd1c758-7fe6-4a4e-b170-8e5c199c937c","Type":"ContainerStarted","Data":"c2efc98767078d8540331caf61d8789feb7ee02ed87d92bd0ca513d969cab451"} Apr 22 14:15:40.911244 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.911209 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"93338f73685c8be0baadc476b3f4a161b3abe76d0bed83ecc50c7f7b11ca1ea5"} Apr 22 14:15:40.914464 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:40.914337 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4tqk" event={"ID":"6d9a3188-bb23-4be3-b39b-234bee924217","Type":"ContainerStarted","Data":"9ddb8c2d15d5f0472abbfafbcdd8c95ab1f08b69c2e5a0fbe5fcaf389e91a6d6"} Apr 22 14:15:41.159528 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.159345 2542 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:41.305807 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.305265 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:41.305807 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.305426 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 
14:15:41.305807 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.305489 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:43.305471142 +0000 UTC m=+5.087848270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:41.406297 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.405692 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:41.406297 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.405883 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:41.406297 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.405900 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:41.406297 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.405913 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:41.406297 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.405965 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:43.405949112 +0000 UTC m=+5.188326223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:41.737810 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.737746 2542 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:39 +0000 UTC" deadline="2027-10-28 11:09:43.8292622 +0000 UTC" Apr 22 14:15:41.737810 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.737786 2542 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13292h54m2.09147936s" Apr 22 14:15:41.783756 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:41.783239 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:41.783756 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:41.783382 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:42.784902 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:42.784423 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:42.784902 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:42.784568 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:43.321445 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:43.321407 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:43.321629 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.321533 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:43.321629 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.321596 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:47.321579165 +0000 UTC m=+9.103956293 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:43.422698 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:43.422666 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:43.422859 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.422844 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:43.422913 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.422867 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:43.422913 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.422879 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:43.423024 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.422937 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:47.422917086 +0000 UTC m=+9.205294201 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:43.783803 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:43.783769 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:43.784001 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:43.783910 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:44.783892 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:44.783858 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:44.784351 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:44.783992 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:45.784084 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:45.783891 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:45.784084 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:45.784043 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:46.783777 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:46.783742 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:46.783976 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:46.783901 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:47.355851 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:47.355806 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:47.356279 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.355946 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:47.356279 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.356036 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.356014522 +0000 UTC m=+17.138391645 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:47.456728 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:47.456653 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:47.456923 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.456826 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:47.456923 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.456849 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:47.456923 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.456862 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:47.457088 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.456969 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:55.456949337 +0000 UTC m=+17.239326464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:47.783391 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:47.783353 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:47.783563 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:47.783498 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:48.784436 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:48.784404 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:48.784886 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:48.784519 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:49.783751 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:49.783719 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:49.783948 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:49.783845 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:49.989403 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:49.989369 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s96sk"] Apr 22 14:15:49.993287 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:49.993253 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:49.993400 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:49.993327 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:50.077138 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.077105 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.077329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.077152 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-dbus\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.077329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.077241 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-kubelet-config\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.177685 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.177650 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.177841 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.177712 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-dbus\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.177841 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.177750 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-kubelet-config\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.177841 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:50.177817 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:50.178017 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.177856 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-kubelet-config\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.178017 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:50.177876 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:50.677858156 +0000 UTC m=+12.460235281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:50.178017 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.177942 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-dbus\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.682357 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.682326 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:50.682523 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:50.682479 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:50.682568 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:50.682555 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.68253734 +0000 UTC m=+13.464914471 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:50.783424 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:50.783394 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:50.783567 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:50.783527 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:51.689342 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:51.689310 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:51.689780 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:51.689469 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:51.689780 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:51.689548 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:53.689527059 +0000 UTC m=+15.471904176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:51.783646 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:51.783613 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:51.783832 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:51.783612 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:51.783832 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:51.783743 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:51.783940 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:51.783830 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:52.783632 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:52.783593 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:52.784013 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:52.783746 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:53.705709 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:53.705666 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:53.705891 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:53.705825 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:53.705944 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:53.705903 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.705885761 +0000 UTC m=+19.488262889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:53.783947 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:53.783913 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:53.784372 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:53.783913 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:53.784372 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:53.784036 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:53.784372 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:53.784107 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:54.783467 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:54.783432 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:54.783666 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:54.783557 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:55.417175 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:55.416961 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:55.417609 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.417138 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.417609 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.417274 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.417258925 +0000 UTC m=+33.199636036 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.517784 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:55.517743 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:55.517972 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.517951 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:55.518039 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.517980 2542 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:55.518039 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.517995 2542 projected.go:194] Error preparing data for projected volume kube-api-access-b8b2k for pod openshift-network-diagnostics/network-check-target-b27s2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.518098 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.518059 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k podName:6259e3a6-004d-4f50-9a36-dc28c9b0cd96 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:11.518040555 +0000 UTC m=+33.300417682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8b2k" (UniqueName: "kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k") pod "network-check-target-b27s2" (UID: "6259e3a6-004d-4f50-9a36-dc28c9b0cd96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.783213 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:55.783122 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:55.783363 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:55.783122 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:55.783363 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.783262 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:55.783363 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:55.783306 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:56.783458 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:56.783414 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:56.783836 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:56.783557 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:57.735636 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.735614 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:57.735762 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:57.735742 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:57.735834 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:57.735806 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.735787475 +0000 UTC m=+27.518164594 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:57.782959 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.782932 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:57.783092 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:57.783054 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:57.783092 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.782932 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:57.783220 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:57.783149 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:57.946793 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.946749 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r7sbq" event={"ID":"8db7d438-84db-45bb-919c-709bca043fd8","Type":"ContainerStarted","Data":"32c9872f492795a154775a12c1cf9436b08e5e8e5101a2fe67fa27c689fd0510"} Apr 22 14:15:57.948760 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.948743 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:15:57.949089 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.949069 2542 generic.go:358] "Generic (PLEG): container finished" podID="73c63add-22e8-4809-b696-9279d2454538" containerID="96178ed4cc5f8f1a79cdb875d97b84f55d1ae87352b826a6809d560fc6830d7b" exitCode=1 Apr 22 14:15:57.949175 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.949124 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"9c647c315d8c2ab2d9a77d4eaa6d5ec7d85e68dd30c1deb59ac24050d6be7700"} Apr 22 14:15:57.949175 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.949145 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"3991003e84a52a4bcbb9deb8c715bfc56031994ae851fa1f28ad08625333f17e"} Apr 22 14:15:57.949175 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.949157 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerDied","Data":"96178ed4cc5f8f1a79cdb875d97b84f55d1ae87352b826a6809d560fc6830d7b"} Apr 22 14:15:57.949175 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:15:57.949167 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"58ba0af78088bc10ceb7b6763fb352539dc136673bc88cb2766f854228a64c95"} Apr 22 14:15:57.950504 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.950485 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" event={"ID":"0c43fa24b7e06ba8965ac528ac76a464","Type":"ContainerStarted","Data":"64c3bab9efbf72f8c8935d6702b6f36159db510fdbb82445404a330a395a83d4"} Apr 22 14:15:57.951874 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.951856 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" event={"ID":"cae1d379-ac2f-4586-9634-429e6dfce7be","Type":"ContainerStarted","Data":"c1a3bfe6d96b7420c87d2649977b60d1ca7ce86728ebe52ebb0ca7d14c55f093"} Apr 22 14:15:57.985287 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.985224 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tg5pr" podStartSLOduration=1.605832072 podStartE2EDuration="18.98520733s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.082647661 +0000 UTC m=+1.865024773" lastFinishedPulling="2026-04-22 14:15:57.462022904 +0000 UTC m=+19.244400031" observedRunningTime="2026-04-22 14:15:57.98471627 +0000 UTC m=+19.767093404" watchObservedRunningTime="2026-04-22 14:15:57.98520733 +0000 UTC m=+19.767584466" Apr 22 14:15:57.985523 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:57.985478 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r7sbq" podStartSLOduration=1.400064461 podStartE2EDuration="18.985467119s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 
14:15:40.092975697 +0000 UTC m=+1.875352811" lastFinishedPulling="2026-04-22 14:15:57.678378342 +0000 UTC m=+19.460755469" observedRunningTime="2026-04-22 14:15:57.965112035 +0000 UTC m=+19.747489168" watchObservedRunningTime="2026-04-22 14:15:57.985467119 +0000 UTC m=+19.767844253" Apr 22 14:15:58.004295 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.004244 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-65.ec2.internal" podStartSLOduration=19.004229543 podStartE2EDuration="19.004229543s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:58.00418039 +0000 UTC m=+19.786557524" watchObservedRunningTime="2026-04-22 14:15:58.004229543 +0000 UTC m=+19.786606676" Apr 22 14:15:58.784233 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.784174 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:15:58.784406 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:58.784329 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:15:58.955424 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.955236 2542 generic.go:358] "Generic (PLEG): container finished" podID="f18f1cd7406c822d6c890121cdd6a6f4" containerID="1732ef6fdd24db7605279011689002faae700ef6fec01b60a6c0ceb6260d8b8f" exitCode=0 Apr 22 14:15:58.956207 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.955321 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" event={"ID":"f18f1cd7406c822d6c890121cdd6a6f4","Type":"ContainerDied","Data":"1732ef6fdd24db7605279011689002faae700ef6fec01b60a6c0ceb6260d8b8f"} Apr 22 14:15:58.956817 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.956788 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4tqk" event={"ID":"6d9a3188-bb23-4be3-b39b-234bee924217","Type":"ContainerStarted","Data":"822a452b1ef3e253410fcd6e9f6ac09a234ee3d201ed676ca20881eda5ad6326"} Apr 22 14:15:58.958176 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.958155 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" event={"ID":"e3614a5a-0db9-44d9-bdc3-8d3344b36689","Type":"ContainerStarted","Data":"59600b11308edb3c675474c6b5251a3662662acb25123549c133e8ea7289a123"} Apr 22 14:15:58.959447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.959409 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h4hlm" event={"ID":"2096774e-6172-439a-a5c0-779a91d43a80","Type":"ContainerStarted","Data":"3682397c3031611b7138d9437c39b37cc8bd164b110b5b3b76843cd360aef2e5"} Apr 22 14:15:58.960637 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.960616 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hgz5l" 
event={"ID":"b6f1e4c3-79b8-4c6b-a1f0-d69c6984614c","Type":"ContainerStarted","Data":"90c2c3d0a500bca5ed8020ad6e308c427ee888ef4341012c1a308e0d1c96b15c"} Apr 22 14:15:58.961871 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.961849 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="c40bbdcefdf495f0fff7b03f9be98a9a7a222e7747a6cc4c1174a9717ec4c9ea" exitCode=0 Apr 22 14:15:58.961969 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.961924 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"c40bbdcefdf495f0fff7b03f9be98a9a7a222e7747a6cc4c1174a9717ec4c9ea"} Apr 22 14:15:58.963277 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.963257 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkvtd" event={"ID":"3dd1c758-7fe6-4a4e-b170-8e5c199c937c","Type":"ContainerStarted","Data":"ae43ef2c4ef5f972ddc1a18ecac4f7f4f8053a18aa53b5227a90a7fd49f14a42"} Apr 22 14:15:58.965953 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.965937 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:15:58.966289 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.966271 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"803625e739b1aaf12ba23cc3ded8b35f7dc0e9e631f57357f57c34d892219bd0"} Apr 22 14:15:58.966359 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.966296 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" 
event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"8b42d3b2c77fec847bb2846e8f611ccdfb45eb1fae0fc0505249f57e77baa61d"} Apr 22 14:15:58.988594 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:58.988538 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k4tqk" podStartSLOduration=2.517986203 podStartE2EDuration="19.988521857s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.088734365 +0000 UTC m=+1.871111476" lastFinishedPulling="2026-04-22 14:15:57.559270005 +0000 UTC m=+19.341647130" observedRunningTime="2026-04-22 14:15:58.988073953 +0000 UTC m=+20.770451077" watchObservedRunningTime="2026-04-22 14:15:58.988521857 +0000 UTC m=+20.770898989" Apr 22 14:15:59.024304 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.024262 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hgz5l" podStartSLOduration=3.55246291 podStartE2EDuration="21.02424791s" podCreationTimestamp="2026-04-22 14:15:38 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.034153403 +0000 UTC m=+1.816530515" lastFinishedPulling="2026-04-22 14:15:57.505938404 +0000 UTC m=+19.288315515" observedRunningTime="2026-04-22 14:15:59.023797528 +0000 UTC m=+20.806174660" watchObservedRunningTime="2026-04-22 14:15:59.02424791 +0000 UTC m=+20.806625042" Apr 22 14:15:59.054316 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.054273 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wkvtd" podStartSLOduration=3.5457558750000002 podStartE2EDuration="21.054259354s" podCreationTimestamp="2026-04-22 14:15:38 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.05034579 +0000 UTC m=+1.832722915" lastFinishedPulling="2026-04-22 14:15:57.558849266 +0000 UTC m=+19.341226394" observedRunningTime="2026-04-22 14:15:59.038086041 +0000 UTC m=+20.820463175" 
watchObservedRunningTime="2026-04-22 14:15:59.054259354 +0000 UTC m=+20.836636465" Apr 22 14:15:59.054493 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.054471 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h4hlm" podStartSLOduration=2.6593597989999997 podStartE2EDuration="20.054468275s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.06701695 +0000 UTC m=+1.849394061" lastFinishedPulling="2026-04-22 14:15:57.462125421 +0000 UTC m=+19.244502537" observedRunningTime="2026-04-22 14:15:59.053155031 +0000 UTC m=+20.835532164" watchObservedRunningTime="2026-04-22 14:15:59.054468275 +0000 UTC m=+20.836845408" Apr 22 14:15:59.178823 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.178797 2542 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:59.542829 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.542793 2542 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:59.543739 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.543717 2542 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:59.751160 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.751051 2542 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:59.178811447Z","UUID":"1f9a2bf6-59be-450d-8f84-bd95acec5337","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:59.753374 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.753340 2542 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 
14:15:59.753374 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.753369 2542 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:59.783017 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.782970 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:15:59.783234 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.782982 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:15:59.783234 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:59.783147 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:15:59.783365 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:15:59.783229 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:15:59.970685 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.970649 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" event={"ID":"e3614a5a-0db9-44d9-bdc3-8d3344b36689","Type":"ContainerStarted","Data":"d1919ce11f51a62bc92a38f4da8e8f162bd3f2b4c98649ce2ef1e743d8f6aab1"} Apr 22 14:15:59.972531 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.972493 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" event={"ID":"f18f1cd7406c822d6c890121cdd6a6f4","Type":"ContainerStarted","Data":"a997679ed762cef9bd5f4f16093ab297f86c76142c08ecfdbe04f317b0405e2e"} Apr 22 14:15:59.973452 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.973432 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:59.973919 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.973901 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h4hlm" Apr 22 14:15:59.993007 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:15:59.992967 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-65.ec2.internal" podStartSLOduration=20.992955444 podStartE2EDuration="20.992955444s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:59.992627684 +0000 UTC m=+21.775004817" watchObservedRunningTime="2026-04-22 14:15:59.992955444 +0000 UTC m=+21.775332576" Apr 22 14:16:00.784023 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:00.783986 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:16:00.784280 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:00.784121 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:16:00.976377 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:00.976341 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" event={"ID":"e3614a5a-0db9-44d9-bdc3-8d3344b36689","Type":"ContainerStarted","Data":"17a268a9b3d350cc516604257e538eeb28b1395682aff91903272f366cff5ac0"} Apr 22 14:16:00.979332 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:00.979304 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:16:00.979670 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:00.979644 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"7a8e01e1519b332d58c0517ff7403d93f6785c8eed97fb4a51bfe338e708dca8"} Apr 22 14:16:01.015402 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:01.015358 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lkbg4" podStartSLOduration=2.031779689 podStartE2EDuration="22.015344051s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.074286063 +0000 UTC m=+1.856663174" lastFinishedPulling="2026-04-22 14:16:00.057850424 +0000 UTC m=+21.840227536" 
observedRunningTime="2026-04-22 14:16:01.014976765 +0000 UTC m=+22.797353897" watchObservedRunningTime="2026-04-22 14:16:01.015344051 +0000 UTC m=+22.797721183" Apr 22 14:16:01.783437 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:01.783403 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:01.783606 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:01.783402 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:01.783606 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:01.783530 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:16:01.783606 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:01.783578 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:16:02.783203 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:02.783155 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:16:02.783649 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:02.783292 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9" Apr 22 14:16:03.783534 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.783342 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:03.784215 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.783342 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:03.784215 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:03.783613 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96" Apr 22 14:16:03.784215 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:03.783680 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e" Apr 22 14:16:03.985765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.985732 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="3773d66a2c7f3f9da63573be86e52e8c9de11c71332dcfb57bd54075c73870e1" exitCode=0 Apr 22 14:16:03.985926 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.985820 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"3773d66a2c7f3f9da63573be86e52e8c9de11c71332dcfb57bd54075c73870e1"} Apr 22 14:16:03.988779 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.988747 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:16:03.989052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.989029 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"55632275b3811988906adf508cbc01708cc0501e0fa0e4fe6fab78ea6d168965"} Apr 22 14:16:03.989373 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.989356 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:03.989373 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.989376 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:03.989513 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:03.989388 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:03.989513 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:16:03.989491 2542 scope.go:117] "RemoveContainer" containerID="96178ed4cc5f8f1a79cdb875d97b84f55d1ae87352b826a6809d560fc6830d7b" Apr 22 14:16:04.004286 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.004266 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:04.004458 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.004446 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:04.784298 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.784045 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:16:04.784713 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:04.784419 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9"
Apr 22 14:16:04.992590 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.992459 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="6701f4614da13bd4818b809dfe4fda3b4a257b073902e68167ae51c1ab86be23" exitCode=0
Apr 22 14:16:04.992590 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.992541 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"6701f4614da13bd4818b809dfe4fda3b4a257b073902e68167ae51c1ab86be23"}
Apr 22 14:16:04.998407 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.998388 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:16:04.998700 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:04.998680 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" event={"ID":"73c63add-22e8-4809-b696-9279d2454538","Type":"ContainerStarted","Data":"1ee422215bd4a948a96a3caa9caed2ee96973723db0322705f0d149f1228acd3"}
Apr 22 14:16:05.045389 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.045331 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" podStartSLOduration=9.453004598 podStartE2EDuration="27.045314343s" podCreationTimestamp="2026-04-22 14:15:38 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.00554638 +0000 UTC m=+1.787923491" lastFinishedPulling="2026-04-22 14:15:57.59785612 +0000 UTC m=+19.380233236" observedRunningTime="2026-04-22 14:16:05.04465772 +0000 UTC m=+26.827034853" watchObservedRunningTime="2026-04-22 14:16:05.045314343 +0000 UTC m=+26.827691472"
Apr 22 14:16:05.150367 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.150330 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g66xm"]
Apr 22 14:16:05.150509 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.150455 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:05.150552 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:05.150538 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9"
Apr 22 14:16:05.154096 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.154067 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s96sk"]
Apr 22 14:16:05.154272 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.154233 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk"
Apr 22 14:16:05.154344 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:05.154310 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e"
Apr 22 14:16:05.154736 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.154713 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b27s2"]
Apr 22 14:16:05.154847 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.154818 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2"
Apr 22 14:16:05.154898 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:05.154887 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96"
Apr 22 14:16:05.796375 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:05.796335 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk"
Apr 22 14:16:05.796733 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:05.796476 2542 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:16:05.796733 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:05.796538 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret podName:00b1aeaa-9a7a-4380-a5aa-0891caae4c5e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:21.796524228 +0000 UTC m=+43.578901339 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret") pod "global-pull-secret-syncer-s96sk" (UID: "00b1aeaa-9a7a-4380-a5aa-0891caae4c5e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:16:06.002931 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:06.002896 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="0811de703d3a94c468b9d26223bb2b4214411921726be6b0066c9ba36b5d4fcd" exitCode=0
Apr 22 14:16:06.003100 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:06.002988 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"0811de703d3a94c468b9d26223bb2b4214411921726be6b0066c9ba36b5d4fcd"}
Apr 22 14:16:06.783332 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:06.783301 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk"
Apr 22 14:16:06.783332 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:06.783321 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:06.783587 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:06.783301 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2"
Apr 22 14:16:06.783587 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:06.783438 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e"
Apr 22 14:16:06.783587 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:06.783475 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96"
Apr 22 14:16:06.783587 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:06.783561 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9"
Apr 22 14:16:08.784076 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:08.784042 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk"
Apr 22 14:16:08.784475 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:08.784101 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2"
Apr 22 14:16:08.784475 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:08.784165 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:08.784475 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:08.784177 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b27s2" podUID="6259e3a6-004d-4f50-9a36-dc28c9b0cd96"
Apr 22 14:16:08.784475 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:08.784278 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s96sk" podUID="00b1aeaa-9a7a-4380-a5aa-0891caae4c5e"
Apr 22 14:16:08.784475 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:08.784382 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g66xm" podUID="d326b6a1-5cbd-47fa-a676-90af9406d2a9"
Apr 22 14:16:09.563879 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.563797 2542 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-65.ec2.internal" event="NodeReady"
Apr 22 14:16:09.564045 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.563948 2542 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 14:16:09.606849 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.605725 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-857878798d-nzjn8"]
Apr 22 14:16:09.635079 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.635047 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctlhj"]
Apr 22 14:16:09.635332 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.635303 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.638691 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.638630 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.638691 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.638646 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 14:16:09.638881 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.638822 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 14:16:09.638937 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.638901 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 14:16:09.639223 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.639204 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.639454 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.639376 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 14:16:09.639454 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.639423 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-f6csh\""
Apr 22 14:16:09.649677 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.649656 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-czjzj"]
Apr 22 14:16:09.649833 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.649812 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.652871 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.652848 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-t4t7l\""
Apr 22 14:16:09.652979 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.652940 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 14:16:09.653170 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.653153 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.653326 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.653305 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 14:16:09.653660 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.653641 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.660995 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.660971 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 14:16:09.664970 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.664948 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8"]
Apr 22 14:16:09.665101 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.665081 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.668203 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.668032 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-ddt4d\""
Apr 22 14:16:09.668203 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.668054 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 14:16:09.668203 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.668084 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 14:16:09.668380 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.668359 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.668532 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.668517 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.673863 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.673845 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 14:16:09.691004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.690975 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"]
Apr 22 14:16:09.691199 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.691159 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8"
Apr 22 14:16:09.693828 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.693809 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-d6s58\""
Apr 22 14:16:09.693828 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.693820 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.694003 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.693820 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.717382 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.717352 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx"]
Apr 22 14:16:09.717526 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.717488 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.720149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.720123 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fd7vw\""
Apr 22 14:16:09.720289 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.720169 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 14:16:09.720289 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.720181 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 14:16:09.720590 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.720573 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 14:16:09.725598 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.725580 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 14:16:09.728889 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.728860 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.728986 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.728920 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-default-certificate\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.728986 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.728960 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.729081 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.728997 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9czf\" (UniqueName: \"kubernetes.io/projected/cda3d6aa-e281-4e92-adba-267754d5dd86-kube-api-access-b9czf\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.729081 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.729032 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-config\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.729154 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.729082 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbgb\" (UniqueName: \"kubernetes.io/projected/749b33a6-3425-465e-a4b7-7844646cda79-kube-api-access-5lbgb\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.729154 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.729117 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-trusted-ca\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.729264 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.729152 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-stats-auth\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.729264 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.729213 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749b33a6-3425-465e-a4b7-7844646cda79-serving-cert\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.741736 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.741712 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"]
Apr 22 14:16:09.741908 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.741891 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx"
Apr 22 14:16:09.744499 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.744464 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 14:16:09.744758 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.744737 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.744838 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.744803 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.744897 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.744860 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xrql5\""
Apr 22 14:16:09.763142 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763121 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8"]
Apr 22 14:16:09.763142 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763147 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctlhj"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763158 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-czjzj"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763169 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763179 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763223 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-857878798d-nzjn8"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763238 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk"]
Apr 22 14:16:09.763320 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.763300 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:16:09.766242 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.766221 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 14:16:09.766346 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.766259 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.766346 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.766309 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jnrv9\""
Apr 22 14:16:09.766452 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.766417 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 14:16:09.766536 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.766513 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.793462 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.793430 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vzzs6"]
Apr 22 14:16:09.793872 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.793475 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk"
Apr 22 14:16:09.796198 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.796163 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 14:16:09.796303 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.796278 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 14:16:09.796303 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.796284 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-q47r8\""
Apr 22 14:16:09.796426 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.796306 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 14:16:09.796426 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.796398 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:16:09.820278 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.820200 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h8bkz"]
Apr 22 14:16:09.820423 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.820349 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vzzs6"
Apr 22 14:16:09.823419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.823093 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 14:16:09.823419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.823109 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 14:16:09.823419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.823230 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7gvsw\""
Apr 22 14:16:09.829867 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.829847 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dzg\" (UniqueName: \"kubernetes.io/projected/ff59a9f8-3474-46f0-9922-f6372bd8119f-kube-api-access-f7dzg\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.829948 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.829880 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.829981 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.829940 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.830021 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.829979 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.830021 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.829987 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.329973326 +0000 UTC m=+32.112350438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt
Apr 22 14:16:09.830098 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830024 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:16:09.830098 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830075 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-tmp\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.830177 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.830093 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:16:09.830177 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830101 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.830177 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830132 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-config\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:09.830177 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.830147 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.330133828 +0000 UTC m=+32.112510940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830181 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830233 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830252 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830273 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830291 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxv5\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:09.830329 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830316 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-service-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.830584 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830336 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff59a9f8-3474-46f0-9922-f6372bd8119f-serving-cert\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj"
Apr 22 14:16:09.830584 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830435 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-default-certificate\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:09.830584 ip-10-0-133-65 kubenswrapper[2542]: I0422
14:16:09.830539 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.830584 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830572 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830592 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830618 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830646 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9czf\" (UniqueName: 
\"kubernetes.io/projected/cda3d6aa-e281-4e92-adba-267754d5dd86-kube-api-access-b9czf\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830694 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-snapshots\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830729 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njk9b\" (UniqueName: \"kubernetes.io/projected/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-kube-api-access-njk9b\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.830767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830763 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lbgb\" (UniqueName: \"kubernetes.io/projected/749b33a6-3425-465e-a4b7-7844646cda79-kube-api-access-5lbgb\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.831035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830797 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8t5k\" (UniqueName: \"kubernetes.io/projected/edd11ecf-24da-4d1e-83da-e059f384b9ac-kube-api-access-x8t5k\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: 
\"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:09.831035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830880 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-trusted-ca\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.831035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.830971 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-stats-auth\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:09.831035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.831007 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vffz\" (UniqueName: \"kubernetes.io/projected/bd7bfce3-b27e-45be-bb80-1b123b5b0fef-kube-api-access-5vffz\") pod \"volume-data-source-validator-7c6cbb6c87-6chj8\" (UID: \"bd7bfce3-b27e-45be-bb80-1b123b5b0fef\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" Apr 22 14:16:09.831272 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.831044 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749b33a6-3425-465e-a4b7-7844646cda79-serving-cert\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.835166 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.835027 2542 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749b33a6-3425-465e-a4b7-7844646cda79-serving-cert\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.835269 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.835055 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-stats-auth\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:09.835269 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.835067 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-default-certificate\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:09.839513 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.839487 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-config\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.841244 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.841221 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7"] Apr 22 14:16:09.841470 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.841412 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:09.841895 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.841874 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749b33a6-3425-465e-a4b7-7844646cda79-trusted-ca\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.844814 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.844792 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:16:09.844925 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.844822 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:16:09.844925 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.844792 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:16:09.845091 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.845073 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9czf\" (UniqueName: \"kubernetes.io/projected/cda3d6aa-e281-4e92-adba-267754d5dd86-kube-api-access-b9czf\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:09.845161 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.845088 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lbgb\" (UniqueName: \"kubernetes.io/projected/749b33a6-3425-465e-a4b7-7844646cda79-kube-api-access-5lbgb\") pod \"console-operator-9d4b6777b-ctlhj\" (UID: \"749b33a6-3425-465e-a4b7-7844646cda79\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.845161 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.845151 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9tgv\"" Apr 22 14:16:09.869300 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.869277 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd"] Apr 22 14:16:09.869406 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.869355 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:09.871898 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.871877 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 14:16:09.871898 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.871887 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 14:16:09.872070 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.871934 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:16:09.872070 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.871962 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rxjmk\"" Apr 22 14:16:09.872070 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.871888 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 
14:16:09.884630 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.884609 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p"] Apr 22 14:16:09.884745 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.884737 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:09.887142 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.887120 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 14:16:09.887262 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.887202 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 14:16:09.887262 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.887235 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 14:16:09.887377 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.887120 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 14:16:09.902427 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.902405 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79"] Apr 22 14:16:09.902532 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.902412 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:09.904875 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.904852 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gskvl\"" Apr 22 14:16:09.904989 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.904927 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 14:16:09.920771 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.920752 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv"] Apr 22 14:16:09.920931 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.920914 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" Apr 22 14:16:09.923295 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.923276 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:16:09.923392 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.923362 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-x4gqj\"" Apr 22 14:16:09.923392 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.923374 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:09.932135 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932115 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njk9b\" (UniqueName: 
\"kubernetes.io/projected/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-kube-api-access-njk9b\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.932253 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932150 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8t5k\" (UniqueName: \"kubernetes.io/projected/edd11ecf-24da-4d1e-83da-e059f384b9ac-kube-api-access-x8t5k\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:09.932253 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932176 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vffz\" (UniqueName: \"kubernetes.io/projected/bd7bfce3-b27e-45be-bb80-1b123b5b0fef-kube-api-access-5vffz\") pod \"volume-data-source-validator-7c6cbb6c87-6chj8\" (UID: \"bd7bfce3-b27e-45be-bb80-1b123b5b0fef\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" Apr 22 14:16:09.932253 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932225 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-tmp-dir\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:09.932396 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932253 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5kh4\" (UniqueName: \"kubernetes.io/projected/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-kube-api-access-p5kh4\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " 
pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:09.932396 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932275 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dzg\" (UniqueName: \"kubernetes.io/projected/ff59a9f8-3474-46f0-9922-f6372bd8119f-kube-api-access-f7dzg\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.932494 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932437 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.932545 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932501 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.932592 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932553 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-tmp\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.932592 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932581 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.932685 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932621 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:09.932685 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.932637 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:09.932685 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932646 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-config\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:09.932823 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932683 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.932823 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.932706 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls 
podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.432688469 +0000 UTC m=+32.215065588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:09.932823 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.932716 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:09.932823 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.932733 2542 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:09.932823 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.932777 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.432761232 +0000 UTC m=+32.215138348 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932878 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnb4j\" (UniqueName: \"kubernetes.io/projected/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-kube-api-access-jnb4j\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932915 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932936 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-tmp\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.932940 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " 
pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933001 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flc56\" (UniqueName: \"kubernetes.io/projected/522ef3ef-a141-4082-8f5a-55a059c52133-kube-api-access-flc56\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933030 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-config-volume\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:09.933066 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933053 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933081 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933110 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bqxv5\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933137 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-service-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933159 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff59a9f8-3474-46f0-9922-f6372bd8119f-serving-cert\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933232 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933257 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933282 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933303 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933335 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-snapshots\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933353 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:09.933419 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.933410 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.934609 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.934583 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.935339 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.935309 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.935464 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.935446 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:09.935582 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:09.935568 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.435549692 +0000 UTC m=+32.217926811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:09.936461 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.936125 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ff59a9f8-3474-46f0-9922-f6372bd8119f-snapshots\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.936461 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.936286 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.936948 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.936929 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff59a9f8-3474-46f0-9922-f6372bd8119f-serving-cert\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.938627 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938606 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " 
pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.938798 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938737 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.938865 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938845 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"] Apr 22 14:16:09.938914 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938877 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk"] Apr 22 14:16:09.938914 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938892 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h8bkz"] Apr 22 14:16:09.938914 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938903 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd"] Apr 22 14:16:09.939035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938916 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p"] Apr 22 14:16:09.939035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938929 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vzzs6"] Apr 22 14:16:09.939035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938940 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7"] Apr 22 14:16:09.939035 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938953 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79"] Apr 22 14:16:09.939035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938964 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv"] Apr 22 14:16:09.939035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.938996 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:09.939299 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.939246 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-service-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.939396 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.939375 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff59a9f8-3474-46f0-9922-f6372bd8119f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.941811 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.941478 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 14:16:09.941811 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.941717 2542 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 14:16:09.941938 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.941906 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 14:16:09.942408 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.942383 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 14:16:09.945233 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.945212 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.945754 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.945730 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vffz\" (UniqueName: \"kubernetes.io/projected/bd7bfce3-b27e-45be-bb80-1b123b5b0fef-kube-api-access-5vffz\") pod \"volume-data-source-validator-7c6cbb6c87-6chj8\" (UID: \"bd7bfce3-b27e-45be-bb80-1b123b5b0fef\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" Apr 22 14:16:09.945846 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.945788 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njk9b\" (UniqueName: \"kubernetes.io/projected/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-kube-api-access-njk9b\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:09.946223 ip-10-0-133-65 kubenswrapper[2542]: 
I0422 14:16:09.946201 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dzg\" (UniqueName: \"kubernetes.io/projected/ff59a9f8-3474-46f0-9922-f6372bd8119f-kube-api-access-f7dzg\") pod \"insights-operator-585dfdc468-czjzj\" (UID: \"ff59a9f8-3474-46f0-9922-f6372bd8119f\") " pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:09.946744 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.946726 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqxv5\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:09.952492 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.952474 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8t5k\" (UniqueName: \"kubernetes.io/projected/edd11ecf-24da-4d1e-83da-e059f384b9ac-kube-api-access-x8t5k\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:09.960655 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.960634 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:09.975435 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:09.975411 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-czjzj" Apr 22 14:16:10.001111 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.001081 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" Apr 22 14:16:10.034315 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034282 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034327 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-config\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034359 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnb4j\" (UniqueName: \"kubernetes.io/projected/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-kube-api-access-jnb4j\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034387 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8cefde9-2f13-4e0b-867b-590873b6fcf8-tmp\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.034399 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: 
secret "canary-serving-cert" not found Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034417 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flc56\" (UniqueName: \"kubernetes.io/projected/522ef3ef-a141-4082-8f5a-55a059c52133-kube-api-access-flc56\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034467 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.034483 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.034473 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.534453672 +0000 UTC m=+32.316830789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034524 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034556 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-config-volume\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034582 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034607 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034675 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85ad698d-071e-44a1-ad7c-b9db3746cb83-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034705 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652tg\" (UniqueName: \"kubernetes.io/projected/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-kube-api-access-652tg\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034743 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlhm\" (UniqueName: \"kubernetes.io/projected/d396d453-0baa-4b43-883b-255470bf1283-kube-api-access-xzlhm\") pod \"network-check-source-8894fc9bd-lqv79\" (UID: \"d396d453-0baa-4b43-883b-255470bf1283\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" Apr 22 14:16:10.034792 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034769 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:16:10.034796 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034821 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e8cefde9-2f13-4e0b-867b-590873b6fcf8-klusterlet-config\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034846 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b578w\" (UniqueName: \"kubernetes.io/projected/85ad698d-071e-44a1-ad7c-b9db3746cb83-kube-api-access-b578w\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.034913 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.034948 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:10.5349359 +0000 UTC m=+32.317313011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034910 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-tmp-dir\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.034985 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5kh4\" (UniqueName: \"kubernetes.io/projected/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-kube-api-access-p5kh4\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035007 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035023 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035060 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035087 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk9g7\" (UniqueName: \"kubernetes.io/projected/e8cefde9-2f13-4e0b-867b-590873b6fcf8-kube-api-access-hk9g7\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035114 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqzk\" (UniqueName: \"kubernetes.io/projected/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-kube-api-access-2vqzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035119 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-tmp-dir\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 
14:16:10.035149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035133 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-config-volume\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.035773 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.035581 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-config\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.037735 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.037713 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.043927 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.043902 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5kh4\" (UniqueName: \"kubernetes.io/projected/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-kube-api-access-p5kh4\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.044061 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.044048 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnb4j\" (UniqueName: \"kubernetes.io/projected/43dd5ba1-95ef-4c1b-b853-cdc8032e3625-kube-api-access-jnb4j\") pod \"service-ca-operator-d6fc45fc5-pbfkk\" (UID: \"43dd5ba1-95ef-4c1b-b853-cdc8032e3625\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.047981 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.047960 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flc56\" (UniqueName: \"kubernetes.io/projected/522ef3ef-a141-4082-8f5a-55a059c52133-kube-api-access-flc56\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:10.103265 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.103173 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" Apr 22 14:16:10.136546 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136513 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85ad698d-071e-44a1-ad7c-b9db3746cb83-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.136694 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136582 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-652tg\" (UniqueName: \"kubernetes.io/projected/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-kube-api-access-652tg\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.136694 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136604 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlhm\" (UniqueName: \"kubernetes.io/projected/d396d453-0baa-4b43-883b-255470bf1283-kube-api-access-xzlhm\") pod \"network-check-source-8894fc9bd-lqv79\" (UID: 
\"d396d453-0baa-4b43-883b-255470bf1283\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" Apr 22 14:16:10.136694 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136635 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.136845 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136755 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e8cefde9-2f13-4e0b-867b-590873b6fcf8-klusterlet-config\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.136845 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136799 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b578w\" (UniqueName: \"kubernetes.io/projected/85ad698d-071e-44a1-ad7c-b9db3746cb83-kube-api-access-b578w\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.136948 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136852 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 
14:16:10.136948 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136880 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.136948 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136933 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.137083 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.136958 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk9g7\" (UniqueName: \"kubernetes.io/projected/e8cefde9-2f13-4e0b-867b-590873b6fcf8-kube-api-access-hk9g7\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.137083 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137000 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqzk\" (UniqueName: \"kubernetes.io/projected/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-kube-api-access-2vqzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.137814 ip-10-0-133-65 kubenswrapper[2542]: I0422 
14:16:10.137345 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8cefde9-2f13-4e0b-867b-590873b6fcf8-tmp\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.137814 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137384 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.137814 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137415 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.137814 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137447 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.137814 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137680 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.138102 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.137985 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8cefde9-2f13-4e0b-867b-590873b6fcf8-tmp\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.139830 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.139775 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e8cefde9-2f13-4e0b-867b-590873b6fcf8-klusterlet-config\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.139830 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.139803 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85ad698d-071e-44a1-ad7c-b9db3746cb83-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.139985 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.139941 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.140070 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.140048 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.140616 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.140592 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.141083 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.141063 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-ca\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.141383 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.141360 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-hub\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.145303 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.145274 2542 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.145940 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.145918 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqzk\" (UniqueName: \"kubernetes.io/projected/3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9-kube-api-access-2vqzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-7rsj7\" (UID: \"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.146180 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.146161 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk9g7\" (UniqueName: \"kubernetes.io/projected/e8cefde9-2f13-4e0b-867b-590873b6fcf8-kube-api-access-hk9g7\") pod \"klusterlet-addon-workmgr-845cc56b65-hzvqd\" (UID: \"e8cefde9-2f13-4e0b-867b-590873b6fcf8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.146405 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.146387 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlhm\" (UniqueName: \"kubernetes.io/projected/d396d453-0baa-4b43-883b-255470bf1283-kube-api-access-xzlhm\") pod \"network-check-source-8894fc9bd-lqv79\" (UID: \"d396d453-0baa-4b43-883b-255470bf1283\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" Apr 22 14:16:10.147160 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.147141 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b578w\" (UniqueName: 
\"kubernetes.io/projected/85ad698d-071e-44a1-ad7c-b9db3746cb83-kube-api-access-b578w\") pod \"managed-serviceaccount-addon-agent-d5f7f875-znf7p\" (UID: \"85ad698d-071e-44a1-ad7c-b9db3746cb83\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.147311 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.147160 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-652tg\" (UniqueName: \"kubernetes.io/projected/8de6ebfa-8fdc-401b-b660-aaaf38fda26e-kube-api-access-652tg\") pod \"cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv\" (UID: \"8de6ebfa-8fdc-401b-b660-aaaf38fda26e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.178464 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.178432 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" Apr 22 14:16:10.194197 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.194157 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:10.219934 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.219904 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" Apr 22 14:16:10.229611 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.229589 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" Apr 22 14:16:10.257562 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.257528 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:16:10.339476 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.339441 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:10.339476 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.339478 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:10.339684 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.339637 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.339612117 +0000 UTC m=+33.121989272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt Apr 22 14:16:10.339754 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.339736 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:16:10.339835 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.339823 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.339791862 +0000 UTC m=+33.122168976 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found Apr 22 14:16:10.440605 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.440564 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:10.440794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.440714 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") 
" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:10.440794 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440722 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:10.440794 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440748 2542 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:10.440971 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440805 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:10.440971 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440814 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.440794209 +0000 UTC m=+33.223171338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:10.440971 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.440892 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:10.440971 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440922 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.440881783 +0000 UTC m=+33.223258916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:10.441145 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.440974 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:10.441145 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.441025 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:11.44101409 +0000 UTC m=+33.223391218 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:10.543088 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.543047 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:10.543285 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.543216 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:10.543440 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.543424 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:10.543520 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.543510 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.543490739 +0000 UTC m=+33.325867864 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:10.543704 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.543679 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:10.543818 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:10.543739 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.543721841 +0000 UTC m=+33.326098952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:10.787083 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.787005 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:10.787285 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.787261 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:16:10.787285 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.787283 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:10.790029 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.790008 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:16:10.790123 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.790029 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7xm6q\"" Apr 22 14:16:10.790123 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.790077 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:10.790273 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:10.790154 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\"" Apr 22 14:16:11.350321 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.350279 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:11.350806 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.350333 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:11.350806 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.350457 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 
14:16:11.350806 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.350472 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.35045023 +0000 UTC m=+35.132827341 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt Apr 22 14:16:11.350806 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.350515 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.350500857 +0000 UTC m=+35.132877968 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found Apr 22 14:16:11.451717 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.451678 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:11.451885 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.451719 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm" Apr 22 14:16:11.451885 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.451774 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:11.451885 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.451831 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:11.451885 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451850 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451899 2542 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451940 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.451913739 +0000 UTC m=+35.234290852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451946 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451971 2542 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451972 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs podName:d326b6a1-5cbd-47fa-a676-90af9406d2a9 nodeName:}" 
failed. No retries permitted until 2026-04-22 14:16:43.451960966 +0000 UTC m=+65.234338083 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs") pod "network-metrics-daemon-g66xm" (UID: "d326b6a1-5cbd-47fa-a676-90af9406d2a9") : secret "metrics-daemon-secret" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.451976 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.452016 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.451999958 +0000 UTC m=+35.234377070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:11.452082 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.452082 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.452066587 +0000 UTC m=+35.234443698 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:11.553952 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.553752 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:11.553952 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.553842 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:11.553952 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.553908 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:11.553952 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.553936 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:11.554316 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.554016 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:13.553989583 +0000 UTC m=+35.336366700 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:11.554316 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.554037 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:11.554316 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:11.554083 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.554068666 +0000 UTC m=+35.336445791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:11.557156 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.557105 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8b2k\" (UniqueName: \"kubernetes.io/projected/6259e3a6-004d-4f50-9a36-dc28c9b0cd96-kube-api-access-b8b2k\") pod \"network-check-target-b27s2\" (UID: \"6259e3a6-004d-4f50-9a36-dc28c9b0cd96\") " pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:11.703861 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.703393 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:11.780667 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.780619 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctlhj"] Apr 22 14:16:11.783228 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.783200 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79"] Apr 22 14:16:11.789799 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.789755 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk"] Apr 22 14:16:11.796410 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.796390 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd"] Apr 22 14:16:11.803310 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.803288 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8"] Apr 22 14:16:11.812995 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.812504 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv"] Apr 22 14:16:11.813988 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.813971 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p"] Apr 22 14:16:11.815920 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.815871 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7"] Apr 22 14:16:11.815920 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:11.815900 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-czjzj"] Apr 22 14:16:11.844603 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:11.844547 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749b33a6_3425_465e_a4b7_7844646cda79.slice/crio-b24c63e6300b4e5bed9ff7caa626653ed21623e2112ab3f800824ff611bf154c WatchSource:0}: Error finding container b24c63e6300b4e5bed9ff7caa626653ed21623e2112ab3f800824ff611bf154c: Status 404 returned error can't find the container with id b24c63e6300b4e5bed9ff7caa626653ed21623e2112ab3f800824ff611bf154c Apr 22 14:16:11.845559 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:11.845307 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd396d453_0baa_4b43_883b_255470bf1283.slice/crio-88a257c61f33e720dfdcc45ce48ae3123386238501a63a09e626f55cdaa52d87 WatchSource:0}: Error finding container 88a257c61f33e720dfdcc45ce48ae3123386238501a63a09e626f55cdaa52d87: Status 404 returned error can't find the container with id 88a257c61f33e720dfdcc45ce48ae3123386238501a63a09e626f55cdaa52d87 Apr 22 14:16:11.846045 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:11.845971 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43dd5ba1_95ef_4c1b_b853_cdc8032e3625.slice/crio-0baa13b7c88188efeee74389e6aa98b97b11b1f689288071720e7afc39e60b37 WatchSource:0}: Error finding container 0baa13b7c88188efeee74389e6aa98b97b11b1f689288071720e7afc39e60b37: Status 404 returned error can't find the container with id 0baa13b7c88188efeee74389e6aa98b97b11b1f689288071720e7afc39e60b37 Apr 22 14:16:11.847016 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:11.846977 2542 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8cefde9_2f13_4e0b_867b_590873b6fcf8.slice/crio-6e7a907af767fd9c82848cffcc2ada9d8394dd993297af0f489e5fddc33ea2d7 WatchSource:0}: Error finding container 6e7a907af767fd9c82848cffcc2ada9d8394dd993297af0f489e5fddc33ea2d7: Status 404 returned error can't find the container with id 6e7a907af767fd9c82848cffcc2ada9d8394dd993297af0f489e5fddc33ea2d7 Apr 22 14:16:12.002867 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.002666 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b27s2"] Apr 22 14:16:12.008600 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:12.008568 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6259e3a6_004d_4f50_9a36_dc28c9b0cd96.slice/crio-a395b3c5495d6269e6421e26ece012985ea601c2ecf359451c683a17784d1836 WatchSource:0}: Error finding container a395b3c5495d6269e6421e26ece012985ea601c2ecf359451c683a17784d1836: Status 404 returned error can't find the container with id a395b3c5495d6269e6421e26ece012985ea601c2ecf359451c683a17784d1836 Apr 22 14:16:12.013003 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.012970 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" event={"ID":"749b33a6-3425-465e-a4b7-7844646cda79","Type":"ContainerStarted","Data":"b24c63e6300b4e5bed9ff7caa626653ed21623e2112ab3f800824ff611bf154c"} Apr 22 14:16:12.014047 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.014014 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b27s2" event={"ID":"6259e3a6-004d-4f50-9a36-dc28c9b0cd96","Type":"ContainerStarted","Data":"a395b3c5495d6269e6421e26ece012985ea601c2ecf359451c683a17784d1836"} Apr 22 14:16:12.014927 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.014908 2542 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" event={"ID":"bd7bfce3-b27e-45be-bb80-1b123b5b0fef","Type":"ContainerStarted","Data":"acd5a47572955c808a61000b0ce754fa3b519a7da950a07ffc133e4d0bd65321"} Apr 22 14:16:12.015795 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.015769 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-czjzj" event={"ID":"ff59a9f8-3474-46f0-9922-f6372bd8119f","Type":"ContainerStarted","Data":"b13d07193ed46ae2c7ea9b277856cf2077a87da226f5871c9aa65384ba9a9590"} Apr 22 14:16:12.016730 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.016704 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerStarted","Data":"a261713a8b36865d6babdc569080537c6404ee176b070c1f51170ec3e56aa08a"} Apr 22 14:16:12.017630 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.017602 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" event={"ID":"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9","Type":"ContainerStarted","Data":"ab315a1e656c3bba4e151de236be7324021472082f9f295f71f84e9da4b580c1"} Apr 22 14:16:12.018573 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.018549 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" event={"ID":"43dd5ba1-95ef-4c1b-b853-cdc8032e3625","Type":"ContainerStarted","Data":"0baa13b7c88188efeee74389e6aa98b97b11b1f689288071720e7afc39e60b37"} Apr 22 14:16:12.019453 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.019437 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" 
event={"ID":"d396d453-0baa-4b43-883b-255470bf1283","Type":"ContainerStarted","Data":"88a257c61f33e720dfdcc45ce48ae3123386238501a63a09e626f55cdaa52d87"} Apr 22 14:16:12.020300 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.020278 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" event={"ID":"85ad698d-071e-44a1-ad7c-b9db3746cb83","Type":"ContainerStarted","Data":"09c95ab34a319fd0b690d0f38c846b040975779dcbb96147c9711e7175408c75"} Apr 22 14:16:12.021020 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:12.021002 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" event={"ID":"e8cefde9-2f13-4e0b-867b-590873b6fcf8","Type":"ContainerStarted","Data":"6e7a907af767fd9c82848cffcc2ada9d8394dd993297af0f489e5fddc33ea2d7"} Apr 22 14:16:13.055806 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.054761 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="b327396cdfeebc4243361f055ce61122038dc7a07d4bc1fff52f37024b56b57c" exitCode=0 Apr 22 14:16:13.055806 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.054837 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"b327396cdfeebc4243361f055ce61122038dc7a07d4bc1fff52f37024b56b57c"} Apr 22 14:16:13.373623 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.373593 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:13.373745 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:16:13.373647 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:13.373884 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.373868 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:16:13.373957 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.373935 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.373916853 +0000 UTC m=+39.156293967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found Apr 22 14:16:13.374524 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.374386 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.37436904 +0000 UTC m=+39.156746157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.475761 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.475896 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.475942 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476153 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476169 2542 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476249 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.476227605 +0000 UTC m=+39.258604731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476460 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476487 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476512 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.476498084 +0000 UTC m=+39.258875199 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:13.476554 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.476554 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.476536156 +0000 UTC m=+39.258913273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.576431 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:13.576538 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.576761 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 
22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.576820 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.576802173 +0000 UTC m=+39.359179290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.577220 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:13.577307 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:13.577268 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.577254076 +0000 UTC m=+39.359631190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:14.107767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:14.106357 2542 generic.go:358] "Generic (PLEG): container finished" podID="6f09af95-f295-4dec-8131-f3dad5bd3e4d" containerID="8cd84445b538115c21ba5aaea65aad2b0abd0b34819b0beac748417244722b69" exitCode=0 Apr 22 14:16:14.107767 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:14.106417 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerDied","Data":"8cd84445b538115c21ba5aaea65aad2b0abd0b34819b0beac748417244722b69"} Apr 22 14:16:17.416523 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.416489 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:17.416523 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.416537 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:17.417122 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.416692 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:25.416673527 +0000 UTC m=+47.199050639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt Apr 22 14:16:17.417122 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.416792 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:16:17.417122 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.416860 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.416843565 +0000 UTC m=+47.199220692 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found Apr 22 14:16:17.517777 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.517743 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:17.517964 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.517827 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:17.517964 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.517855 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:17.517964 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.517941 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518001 2542 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518007 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518045 2542 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518046 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.518023958 +0000 UTC m=+47.300401086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518079 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.518066644 +0000 UTC m=+47.300443755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:17.518119 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.518091 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.51808409 +0000 UTC m=+47.300461201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:17.618389 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.618356 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:17.618563 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:17.618450 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:17.618563 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.618513 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:17.618682 
ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.618578 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.618553729 +0000 UTC m=+47.400930845 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:17.618682 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.618631 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:17.618769 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:17.618699 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.618681981 +0000 UTC m=+47.401059096 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:18.042926 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.042886 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"] Apr 22 14:16:18.081554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.081520 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"] Apr 22 14:16:18.081554 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.081548 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.084238 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.084219 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 14:16:18.084347 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.084308 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-d9s8x\"" Apr 22 14:16:18.085334 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.085315 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 14:16:18.224793 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.224749 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.224970 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.224890 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/195e9aad-8ddd-4d14-84ae-1158d9c78159-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.325651 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.325549 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.325651 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.325645 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/195e9aad-8ddd-4d14-84ae-1158d9c78159-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.325838 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:18.325712 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:18.325838 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:18.325796 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:18.825773854 +0000 UTC m=+40.608150987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:18.326387 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.326366 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/195e9aad-8ddd-4d14-84ae-1158d9c78159-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.829849 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:18.829819 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:18.830375 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:18.829954 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:18.830375 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:18.830013 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:19.829996891 +0000 UTC m=+41.612374003 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:19.839267 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:19.839226 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:19.839713 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:19.839403 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:19.839713 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:19.839501 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:21.83947959 +0000 UTC m=+43.621856717 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:21.856434 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:21.856390 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:21.856894 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:21.856497 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:21.856894 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:21.856673 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:21.856894 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:21.856775 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.856754731 +0000 UTC m=+47.639131860 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:21.860659 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:21.860638 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/00b1aeaa-9a7a-4380-a5aa-0891caae4c5e-original-pull-secret\") pod \"global-pull-secret-syncer-s96sk\" (UID: \"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e\") " pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:21.924994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:21.924952 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s96sk" Apr 22 14:16:24.627478 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:24.627070 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s96sk"] Apr 22 14:16:25.141062 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.140557 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" event={"ID":"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9","Type":"ContainerStarted","Data":"c01b2382e7816a2496cb1be4f2c2749df065f30d6ff5fcb2a04392db237a7031"} Apr 22 14:16:25.142737 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.142705 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" event={"ID":"43dd5ba1-95ef-4c1b-b853-cdc8032e3625","Type":"ContainerStarted","Data":"90dcca55c5a4c37debf94a75e02e40ab5ae42e54f3bd01ea65cbb2a0dff5f20b"} Apr 22 14:16:25.144739 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.144714 2542 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" event={"ID":"d396d453-0baa-4b43-883b-255470bf1283","Type":"ContainerStarted","Data":"7c14dbf87e80c95e6a12d75bfbb4e49b986e170d43e57f670208488ca0dcc7bd"} Apr 22 14:16:25.146115 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.146087 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s96sk" event={"ID":"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e","Type":"ContainerStarted","Data":"1869be8a7bcd29b2749b14e69bdb9a6c7ebfd0eaf0d92072d6a0ee38f3bf1082"} Apr 22 14:16:25.147660 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.147621 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" event={"ID":"85ad698d-071e-44a1-ad7c-b9db3746cb83","Type":"ContainerStarted","Data":"00deaa6cd589bc84622d991e235e6c04b3f8a67c11603755d0bbab4b855abc91"} Apr 22 14:16:25.151078 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.151054 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7x75g" event={"ID":"6f09af95-f295-4dec-8131-f3dad5bd3e4d","Type":"ContainerStarted","Data":"df01222f01d158078ae949a384236ddd9bea622efd11ff94e45c6709c9054c08"} Apr 22 14:16:25.153012 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.152893 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" event={"ID":"e8cefde9-2f13-4e0b-867b-590873b6fcf8","Type":"ContainerStarted","Data":"369787a46e674f2c243a81850dcb9663fead5c008e95a9a39245288c64f8f1b5"} Apr 22 14:16:25.153321 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.153277 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:25.154535 ip-10-0-133-65 kubenswrapper[2542]: 
I0422 14:16:25.154514 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/0.log" Apr 22 14:16:25.154622 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.154548 2542 generic.go:358] "Generic (PLEG): container finished" podID="749b33a6-3425-465e-a4b7-7844646cda79" containerID="d63c8e1ac9fb37adeba52149d9c46798ccbd026ae58927546af4ecdceb7c1203" exitCode=255 Apr 22 14:16:25.154622 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.154602 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" event={"ID":"749b33a6-3425-465e-a4b7-7844646cda79","Type":"ContainerDied","Data":"d63c8e1ac9fb37adeba52149d9c46798ccbd026ae58927546af4ecdceb7c1203"} Apr 22 14:16:25.155084 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.154844 2542 scope.go:117] "RemoveContainer" containerID="d63c8e1ac9fb37adeba52149d9c46798ccbd026ae58927546af4ecdceb7c1203" Apr 22 14:16:25.155841 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.155803 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" Apr 22 14:16:25.157394 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.156806 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b27s2" event={"ID":"6259e3a6-004d-4f50-9a36-dc28c9b0cd96","Type":"ContainerStarted","Data":"149fcada21c9b45120e32ec3b32a296ff8371a81f56ccba5460e0e35d52bb649"} Apr 22 14:16:25.157394 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.157177 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b27s2" Apr 22 14:16:25.160062 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.159917 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" event={"ID":"bd7bfce3-b27e-45be-bb80-1b123b5b0fef","Type":"ContainerStarted","Data":"b3e24a2606f7d6767440efde1abd70d7f3f2d14ec2c7368cffd9981813ff192b"} Apr 22 14:16:25.162941 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.162919 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-czjzj" event={"ID":"ff59a9f8-3474-46f0-9922-f6372bd8119f","Type":"ContainerStarted","Data":"40fa5088a75ac77b4b15c3a227045fc6256c44fdd765dab124ecf9d85577ef37"} Apr 22 14:16:25.163135 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.163088 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" podStartSLOduration=27.546548119 podStartE2EDuration="40.16307217s" podCreationTimestamp="2026-04-22 14:15:45 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.881371523 +0000 UTC m=+33.663748640" lastFinishedPulling="2026-04-22 14:16:24.497895565 +0000 UTC m=+46.280272691" observedRunningTime="2026-04-22 14:16:25.159259368 +0000 UTC m=+46.941636505" watchObservedRunningTime="2026-04-22 14:16:25.16307217 +0000 UTC m=+46.945449309" Apr 22 14:16:25.164846 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.164818 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerStarted","Data":"6053814e56fd7167ba622939fadb8d0f688e0564f77601fa3359bff4fc5595ed"} Apr 22 14:16:25.208059 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.207025 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d5f7f875-znf7p" podStartSLOduration=30.589039884 podStartE2EDuration="43.207007692s" 
podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.881708848 +0000 UTC m=+33.664085961" lastFinishedPulling="2026-04-22 14:16:24.499676634 +0000 UTC m=+46.282053769" observedRunningTime="2026-04-22 14:16:25.17769549 +0000 UTC m=+46.960072624" watchObservedRunningTime="2026-04-22 14:16:25.207007692 +0000 UTC m=+46.989384826" Apr 22 14:16:25.208691 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.208348 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7x75g" podStartSLOduration=15.328192146 podStartE2EDuration="47.208336205s" podCreationTimestamp="2026-04-22 14:15:38 +0000 UTC" firstStartedPulling="2026-04-22 14:15:40.024342084 +0000 UTC m=+1.806719195" lastFinishedPulling="2026-04-22 14:16:11.904486135 +0000 UTC m=+33.686863254" observedRunningTime="2026-04-22 14:16:25.203876838 +0000 UTC m=+46.986253972" watchObservedRunningTime="2026-04-22 14:16:25.208336205 +0000 UTC m=+46.990713339" Apr 22 14:16:25.234282 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.228128 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b27s2" podStartSLOduration=33.723620602 podStartE2EDuration="46.228109023s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:16:12.010891999 +0000 UTC m=+33.793269110" lastFinishedPulling="2026-04-22 14:16:24.515380417 +0000 UTC m=+46.297757531" observedRunningTime="2026-04-22 14:16:25.226904923 +0000 UTC m=+47.009282054" watchObservedRunningTime="2026-04-22 14:16:25.228109023 +0000 UTC m=+47.010486156" Apr 22 14:16:25.248596 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.248252 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-845cc56b65-hzvqd" podStartSLOduration=30.598670924 podStartE2EDuration="43.248232913s" 
podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.850604499 +0000 UTC m=+33.632981623" lastFinishedPulling="2026-04-22 14:16:24.500166484 +0000 UTC m=+46.282543612" observedRunningTime="2026-04-22 14:16:25.246285351 +0000 UTC m=+47.028662487" watchObservedRunningTime="2026-04-22 14:16:25.248232913 +0000 UTC m=+47.030610047" Apr 22 14:16:25.305421 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.305233 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" podStartSLOduration=25.790356238 podStartE2EDuration="38.305064221s" podCreationTimestamp="2026-04-22 14:15:47 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.848560027 +0000 UTC m=+33.630937141" lastFinishedPulling="2026-04-22 14:16:24.363267998 +0000 UTC m=+46.145645124" observedRunningTime="2026-04-22 14:16:25.302686929 +0000 UTC m=+47.085064059" watchObservedRunningTime="2026-04-22 14:16:25.305064221 +0000 UTC m=+47.087441357" Apr 22 14:16:25.316036 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.314667 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lqv79" podStartSLOduration=23.662409689 podStartE2EDuration="36.314648695s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.847846554 +0000 UTC m=+33.630223677" lastFinishedPulling="2026-04-22 14:16:24.500085553 +0000 UTC m=+46.282462683" observedRunningTime="2026-04-22 14:16:25.28539757 +0000 UTC m=+47.067774704" watchObservedRunningTime="2026-04-22 14:16:25.314648695 +0000 UTC m=+47.097025827" Apr 22 14:16:25.339717 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.338371 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-czjzj" podStartSLOduration=30.719940703 podStartE2EDuration="43.338351396s" 
podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.881702553 +0000 UTC m=+33.664079682" lastFinishedPulling="2026-04-22 14:16:24.500113247 +0000 UTC m=+46.282490375" observedRunningTime="2026-04-22 14:16:25.337573293 +0000 UTC m=+47.119950429" watchObservedRunningTime="2026-04-22 14:16:25.338351396 +0000 UTC m=+47.120728527" Apr 22 14:16:25.490720 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.490625 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:25.490985 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.490947 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:25.491360 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.491330 2542 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:16:25.491594 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.491528 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.491491304 +0000 UTC m=+63.273868416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : secret "router-metrics-certs-default" not found Apr 22 14:16:25.492259 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.492229 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle podName:cda3d6aa-e281-4e92-adba-267754d5dd86 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.492212531 +0000 UTC m=+63.274589648 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle") pod "router-default-857878798d-nzjn8" (UID: "cda3d6aa-e281-4e92-adba-267754d5dd86") : configmap references non-existent config key: service-ca.crt Apr 22 14:16:25.592122 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.592068 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:25.592297 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.592220 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:25.592367 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.592304 2542 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:25.592492 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.592476 2542 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:16:25.592552 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.592542 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls podName:edd11ecf-24da-4d1e-83da-e059f384b9ac nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.592523896 +0000 UTC m=+63.374901023 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rfgzx" (UID: "edd11ecf-24da-4d1e-83da-e059f384b9ac") : secret "samples-operator-tls" not found Apr 22 14:16:25.593237 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.593121 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:25.593237 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.593166 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.593153239 +0000 UTC m=+63.375530357 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:25.593373 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.593237 2542 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:25.593373 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.593260 2542 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fd97b775f-28299: secret "image-registry-tls" not found Apr 22 14:16:25.593373 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.593312 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls podName:e092464a-c357-4406-b079-4c8f0ece38e9 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.593298722 +0000 UTC m=+63.375675836 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls") pod "image-registry-7fd97b775f-28299" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9") : secret "image-registry-tls" not found Apr 22 14:16:25.694205 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.693804 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:25.694205 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.693896 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:25.694205 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.694025 2542 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:25.694205 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.694082 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls podName:fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.694065492 +0000 UTC m=+63.476442609 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls") pod "dns-default-vzzs6" (UID: "fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1") : secret "dns-default-metrics-tls" not found Apr 22 14:16:25.694785 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.694580 2542 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:25.694785 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.694624 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert podName:522ef3ef-a141-4082-8f5a-55a059c52133 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:41.694610139 +0000 UTC m=+63.476987256 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert") pod "ingress-canary-h8bkz" (UID: "522ef3ef-a141-4082-8f5a-55a059c52133") : secret "canary-serving-cert" not found Apr 22 14:16:25.895702 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:25.895280 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:25.895702 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.895570 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:25.895702 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:25.895636 2542 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:33.895617482 +0000 UTC m=+55.677994609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:26.180982 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.180851 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:16:26.184430 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.182540 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/0.log" Apr 22 14:16:26.184430 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.182576 2542 generic.go:358] "Generic (PLEG): container finished" podID="749b33a6-3425-465e-a4b7-7844646cda79" containerID="985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871" exitCode=255 Apr 22 14:16:26.184430 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.183781 2542 scope.go:117] "RemoveContainer" containerID="985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871" Apr 22 14:16:26.184430 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:26.183952 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctlhj_openshift-console-operator(749b33a6-3425-465e-a4b7-7844646cda79)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" podUID="749b33a6-3425-465e-a4b7-7844646cda79" Apr 22 14:16:26.185552 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.184175 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" event={"ID":"749b33a6-3425-465e-a4b7-7844646cda79","Type":"ContainerDied","Data":"985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871"} Apr 22 14:16:26.185552 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.185444 2542 scope.go:117] "RemoveContainer" containerID="d63c8e1ac9fb37adeba52149d9c46798ccbd026ae58927546af4ecdceb7c1203" Apr 22 14:16:26.208736 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:26.208551 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6chj8" podStartSLOduration=31.876014749 podStartE2EDuration="44.208533104s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.881450539 +0000 UTC m=+33.663827664" lastFinishedPulling="2026-04-22 14:16:24.213968905 +0000 UTC m=+45.996346019" observedRunningTime="2026-04-22 14:16:25.359986402 +0000 UTC m=+47.142363536" watchObservedRunningTime="2026-04-22 14:16:26.208533104 +0000 UTC m=+47.990910235" Apr 22 14:16:27.023481 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.023442 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5tvtw"] Apr 22 14:16:27.050136 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.050092 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5tvtw"] Apr 22 14:16:27.050332 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.050263 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.053068 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.053044 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:16:27.053211 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.053111 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4htz2\"" Apr 22 14:16:27.053211 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.053118 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:16:27.107352 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.107308 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/63a7ba57-56d4-4bc2-a700-601433fd8838-crio-socket\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.107524 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.107389 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.107524 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.107427 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tvtw\" (UID: 
\"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.107617 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.107558 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/63a7ba57-56d4-4bc2-a700-601433fd8838-data-volume\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.107686 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.107669 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2wf\" (UniqueName: \"kubernetes.io/projected/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-api-access-bj2wf\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.186929 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.186899 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:16:27.187341 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.187319 2542 scope.go:117] "RemoveContainer" containerID="985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871" Apr 22 14:16:27.187586 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:27.187542 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctlhj_openshift-console-operator(749b33a6-3425-465e-a4b7-7844646cda79)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" podUID="749b33a6-3425-465e-a4b7-7844646cda79" Apr 22 14:16:27.208744 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:16:27.208712 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2wf\" (UniqueName: \"kubernetes.io/projected/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-api-access-bj2wf\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.208892 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.208796 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/63a7ba57-56d4-4bc2-a700-601433fd8838-crio-socket\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.208892 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.208844 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.208998 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:27.208935 2542 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.208998 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.208940 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/63a7ba57-56d4-4bc2-a700-601433fd8838-crio-socket\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.208998 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:27.208990 2542 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls podName:63a7ba57-56d4-4bc2-a700-601433fd8838 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.708972983 +0000 UTC m=+49.491350101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5tvtw" (UID: "63a7ba57-56d4-4bc2-a700-601433fd8838") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.209157 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.209017 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.209157 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.209100 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/63a7ba57-56d4-4bc2-a700-601433fd8838-data-volume\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.209619 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.209594 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/63a7ba57-56d4-4bc2-a700-601433fd8838-data-volume\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.217928 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.217883 2542 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2wf\" (UniqueName: \"kubernetes.io/projected/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-api-access-bj2wf\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.224031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.224002 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/63a7ba57-56d4-4bc2-a700-601433fd8838-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.713744 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:27.713718 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:27.713914 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:27.713893 2542 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.713988 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:27.713974 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls podName:63a7ba57-56d4-4bc2-a700-601433fd8838 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:28.713952807 +0000 UTC m=+50.496329918 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5tvtw" (UID: "63a7ba57-56d4-4bc2-a700-601433fd8838") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.191831 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:28.191788 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerStarted","Data":"f085aace21bc81bfb92af19f451ef9fc15a714819b14566979d8ead4f742ae63"} Apr 22 14:16:28.191831 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:28.191826 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerStarted","Data":"5ec9c95d6d541d002652171bd7954986ed6367f8ba32b3cbe863dab482dc72f8"} Apr 22 14:16:28.213707 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:28.213656 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" podStartSLOduration=30.591306984 podStartE2EDuration="46.213637444s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.881224939 +0000 UTC m=+33.663602050" lastFinishedPulling="2026-04-22 14:16:27.503555396 +0000 UTC m=+49.285932510" observedRunningTime="2026-04-22 14:16:28.213284402 +0000 UTC m=+49.995661536" watchObservedRunningTime="2026-04-22 14:16:28.213637444 +0000 UTC m=+49.996014576" Apr 22 14:16:28.723460 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:28.723422 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:28.723713 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:28.723563 2542 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.723713 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:28.723645 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls podName:63a7ba57-56d4-4bc2-a700-601433fd8838 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.72362406 +0000 UTC m=+52.506001182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5tvtw" (UID: "63a7ba57-56d4-4bc2-a700-601433fd8838") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:29.131774 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:29.131750 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k4tqk_6d9a3188-bb23-4be3-b39b-234bee924217/dns-node-resolver/0.log" Apr 22 14:16:29.961004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:29.960953 2542 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:29.961004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:29.961006 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" Apr 22 14:16:29.961457 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:29.961381 2542 scope.go:117] "RemoveContainer" 
containerID="985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871" Apr 22 14:16:29.961561 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:29.961543 2542 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctlhj_openshift-console-operator(749b33a6-3425-465e-a4b7-7844646cda79)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" podUID="749b33a6-3425-465e-a4b7-7844646cda79" Apr 22 14:16:30.198613 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:30.198582 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s96sk" event={"ID":"00b1aeaa-9a7a-4380-a5aa-0891caae4c5e","Type":"ContainerStarted","Data":"c10f6115303abf2be56b923794698845c9960caeff6f58666d6b2ccea46f2ddd"} Apr 22 14:16:30.220808 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:30.220707 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s96sk" podStartSLOduration=36.573783852 podStartE2EDuration="41.220693697s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:16:24.635223449 +0000 UTC m=+46.417600576" lastFinishedPulling="2026-04-22 14:16:29.282133307 +0000 UTC m=+51.064510421" observedRunningTime="2026-04-22 14:16:30.21960387 +0000 UTC m=+52.001981003" watchObservedRunningTime="2026-04-22 14:16:30.220693697 +0000 UTC m=+52.003070829" Apr 22 14:16:30.331974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:30.331944 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wkvtd_3dd1c758-7fe6-4a4e-b170-8e5c199c937c/node-ca/0.log" Apr 22 14:16:30.741262 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:30.741226 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" 
(UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:30.741420 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:30.741373 2542 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:30.741456 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:30.741437 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls podName:63a7ba57-56d4-4bc2-a700-601433fd8838 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:34.741423185 +0000 UTC m=+56.523800297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5tvtw" (UID: "63a7ba57-56d4-4bc2-a700-601433fd8838") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:31.732933 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:31.732900 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7rsj7_3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9/kube-storage-version-migrator-operator/0.log" Apr 22 14:16:33.969643 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:33.969601 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" Apr 22 14:16:33.970055 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:33.969764 2542 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:16:33.970055 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:33.969835 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert podName:195e9aad-8ddd-4d14-84ae-1158d9c78159 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:49.969820502 +0000 UTC m=+71.752197613 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fnfhk" (UID: "195e9aad-8ddd-4d14-84ae-1158d9c78159") : secret "networking-console-plugin-cert" not found Apr 22 14:16:34.775543 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:34.775489 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw" Apr 22 14:16:34.775784 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:34.775642 2542 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:34.775784 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:34.775738 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls podName:63a7ba57-56d4-4bc2-a700-601433fd8838 
nodeName:}" failed. No retries permitted until 2026-04-22 14:16:42.775716521 +0000 UTC m=+64.558093634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5tvtw" (UID: "63a7ba57-56d4-4bc2-a700-601433fd8838") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:36.022234 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:36.022204 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q7q7t" Apr 22 14:16:41.535770 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.535725 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:41.536181 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.535931 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:41.536529 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.536512 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda3d6aa-e281-4e92-adba-267754d5dd86-service-ca-bundle\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:41.538149 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.538131 2542 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cda3d6aa-e281-4e92-adba-267754d5dd86-metrics-certs\") pod \"router-default-857878798d-nzjn8\" (UID: \"cda3d6aa-e281-4e92-adba-267754d5dd86\") " pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:41.636806 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.636759 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:41.637004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.636848 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" Apr 22 14:16:41.637004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.636876 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:41.637163 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:16:41.637015 2542 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:41.637163 ip-10-0-133-65 kubenswrapper[2542]: E0422 
14:16:41.637120 2542 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls podName:b99ce1d1-2784-4681-b12f-a7fc95fa5fa1 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:13.637097869 +0000 UTC m=+95.419474996 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nqmkm" (UID: "b99ce1d1-2784-4681-b12f-a7fc95fa5fa1") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:16:41.639162 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.639140 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"image-registry-7fd97b775f-28299\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") " pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:41.639162 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.639155 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edd11ecf-24da-4d1e-83da-e059f384b9ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rfgzx\" (UID: \"edd11ecf-24da-4d1e-83da-e059f384b9ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:41.737465 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.737422 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:41.737631 ip-10-0-133-65 kubenswrapper[2542]: 
I0422 14:16:41.737485 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:41.739732 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.739702 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1-metrics-tls\") pod \"dns-default-vzzs6\" (UID: \"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1\") " pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:41.739838 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.739799 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522ef3ef-a141-4082-8f5a-55a059c52133-cert\") pod \"ingress-canary-h8bkz\" (UID: \"522ef3ef-a141-4082-8f5a-55a059c52133\") " pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:41.750703 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.750678 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-f6csh\"" Apr 22 14:16:41.759227 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.759207 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-857878798d-nzjn8" Apr 22 14:16:41.830302 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.830269 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fd7vw\"" Apr 22 14:16:41.838455 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.838429 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fd97b775f-28299" Apr 22 14:16:41.854482 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.854382 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xrql5\"" Apr 22 14:16:41.861692 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.861667 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" Apr 22 14:16:41.882783 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.882754 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-857878798d-nzjn8"] Apr 22 14:16:41.885255 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:41.885220 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda3d6aa_e281_4e92_adba_267754d5dd86.slice/crio-722c0bd7f3ef3a7af105ac7d9e0406ad805245fafdf4951bfca46a5125612770 WatchSource:0}: Error finding container 722c0bd7f3ef3a7af105ac7d9e0406ad805245fafdf4951bfca46a5125612770: Status 404 returned error can't find the container with id 722c0bd7f3ef3a7af105ac7d9e0406ad805245fafdf4951bfca46a5125612770 Apr 22 14:16:41.937701 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.937628 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7gvsw\"" Apr 22 14:16:41.942364 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.942335 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vzzs6" Apr 22 14:16:41.962903 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.962860 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9tgv\"" Apr 22 14:16:41.970967 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.970940 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h8bkz" Apr 22 14:16:41.979804 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:41.979780 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"] Apr 22 14:16:41.982938 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:41.982900 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode092464a_c357_4406_b079_4c8f0ece38e9.slice/crio-88de2026adb99848272a63b94383f243171970e38d8e374bc805570f0169ef74 WatchSource:0}: Error finding container 88de2026adb99848272a63b94383f243171970e38d8e374bc805570f0169ef74: Status 404 returned error can't find the container with id 88de2026adb99848272a63b94383f243171970e38d8e374bc805570f0169ef74 Apr 22 14:16:42.009428 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.009377 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx"] Apr 22 14:16:42.101693 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.101667 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vzzs6"] Apr 22 14:16:42.104497 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:42.104468 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2f50e4_69ed_4a6b_9fe0_3d70269a73d1.slice/crio-7d8d295f1281ab557683627888bd845abb31d707f9f1f9be72f4c9e30485bb66 WatchSource:0}: Error finding 
container 7d8d295f1281ab557683627888bd845abb31d707f9f1f9be72f4c9e30485bb66: Status 404 returned error can't find the container with id 7d8d295f1281ab557683627888bd845abb31d707f9f1f9be72f4c9e30485bb66
Apr 22 14:16:42.114918 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.114895    2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h8bkz"]
Apr 22 14:16:42.117608 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:42.117581    2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522ef3ef_a141_4082_8f5a_55a059c52133.slice/crio-78e8620fd597c81070cf83393d3d2a1c065b4ee4ace04b69f036c5a1513c0569 WatchSource:0}: Error finding container 78e8620fd597c81070cf83393d3d2a1c065b4ee4ace04b69f036c5a1513c0569: Status 404 returned error can't find the container with id 78e8620fd597c81070cf83393d3d2a1c065b4ee4ace04b69f036c5a1513c0569
Apr 22 14:16:42.233839 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.233807    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h8bkz" event={"ID":"522ef3ef-a141-4082-8f5a-55a059c52133","Type":"ContainerStarted","Data":"78e8620fd597c81070cf83393d3d2a1c065b4ee4ace04b69f036c5a1513c0569"}
Apr 22 14:16:42.234961 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.234935    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" event={"ID":"edd11ecf-24da-4d1e-83da-e059f384b9ac","Type":"ContainerStarted","Data":"9b5b7faed84e3818b7fc02f00319ceac7c64543d06fda2e708419e560fb0985b"}
Apr 22 14:16:42.236429 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.236402    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd97b775f-28299" event={"ID":"e092464a-c357-4406-b079-4c8f0ece38e9","Type":"ContainerStarted","Data":"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"}
Apr 22 14:16:42.236552 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.236432    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd97b775f-28299" event={"ID":"e092464a-c357-4406-b079-4c8f0ece38e9","Type":"ContainerStarted","Data":"88de2026adb99848272a63b94383f243171970e38d8e374bc805570f0169ef74"}
Apr 22 14:16:42.236552 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.236465    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:16:42.237442 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.237422    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzzs6" event={"ID":"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1","Type":"ContainerStarted","Data":"7d8d295f1281ab557683627888bd845abb31d707f9f1f9be72f4c9e30485bb66"}
Apr 22 14:16:42.238641 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.238624    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-857878798d-nzjn8" event={"ID":"cda3d6aa-e281-4e92-adba-267754d5dd86","Type":"ContainerStarted","Data":"41ac0f4e1c7672abe86ad1bcec04731b1a444ced2c992544b933adbd9e925519"}
Apr 22 14:16:42.238720 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.238645    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-857878798d-nzjn8" event={"ID":"cda3d6aa-e281-4e92-adba-267754d5dd86","Type":"ContainerStarted","Data":"722c0bd7f3ef3a7af105ac7d9e0406ad805245fafdf4951bfca46a5125612770"}
Apr 22 14:16:42.256321 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.256285    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fd97b775f-28299" podStartSLOduration=60.256272541 podStartE2EDuration="1m0.256272541s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:16:42.255076345 +0000 UTC m=+64.037453555" watchObservedRunningTime="2026-04-22 14:16:42.256272541 +0000 UTC m=+64.038649674"
Apr 22 14:16:42.274832 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.274779    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-857878798d-nzjn8" podStartSLOduration=60.274764925 podStartE2EDuration="1m0.274764925s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:16:42.273531027 +0000 UTC m=+64.055908160" watchObservedRunningTime="2026-04-22 14:16:42.274764925 +0000 UTC m=+64.057142037"
Apr 22 14:16:42.760333 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.759917    2542 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:42.762888 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.762696    2542 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:42.848335 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.848287    2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw"
Apr 22 14:16:42.851478 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.851449    2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/63a7ba57-56d4-4bc2-a700-601433fd8838-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tvtw\" (UID: \"63a7ba57-56d4-4bc2-a700-601433fd8838\") " pod="openshift-insights/insights-runtime-extractor-5tvtw"
Apr 22 14:16:42.965484 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.965451    2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4htz2\""
Apr 22 14:16:42.973475 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:42.972905    2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5tvtw"
Apr 22 14:16:43.131111 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.131081    2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5tvtw"]
Apr 22 14:16:43.138641 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:43.138603    2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63a7ba57_56d4_4bc2_a700_601433fd8838.slice/crio-800ae202b19439936b42e7da036fe33af4bfacea2b851933e78d8209865b1e3d WatchSource:0}: Error finding container 800ae202b19439936b42e7da036fe33af4bfacea2b851933e78d8209865b1e3d: Status 404 returned error can't find the container with id 800ae202b19439936b42e7da036fe33af4bfacea2b851933e78d8209865b1e3d
Apr 22 14:16:43.243367 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.243316    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tvtw" event={"ID":"63a7ba57-56d4-4bc2-a700-601433fd8838","Type":"ContainerStarted","Data":"800ae202b19439936b42e7da036fe33af4bfacea2b851933e78d8209865b1e3d"}
Apr 22 14:16:43.243663 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.243641    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:43.244942 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.244922    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-857878798d-nzjn8"
Apr 22 14:16:43.455227 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.454963    2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:43.457765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.457719    2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d326b6a1-5cbd-47fa-a676-90af9406d2a9-metrics-certs\") pod \"network-metrics-daemon-g66xm\" (UID: \"d326b6a1-5cbd-47fa-a676-90af9406d2a9\") " pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:43.512810 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.512768    2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\""
Apr 22 14:16:43.520250 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.520221    2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g66xm"
Apr 22 14:16:43.783305 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:43.783276    2542 scope.go:117] "RemoveContainer" containerID="985717c4b266b4214c3b8c9f4fc2272cb506101f87300d5d8656e81e3ce6a871"
Apr 22 14:16:44.247899 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:44.247857    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tvtw" event={"ID":"63a7ba57-56d4-4bc2-a700-601433fd8838","Type":"ContainerStarted","Data":"9d600dd7a51cfd1ad60fbdfdba450579369c29111f0c37208cdbdf71366cf733"}
Apr 22 14:16:45.568107 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:45.568052    2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g66xm"]
Apr 22 14:16:45.616983 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:45.616846    2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd326b6a1_5cbd_47fa_a676_90af9406d2a9.slice/crio-93d467c754526c215699381e838dd8b81844b027b75a3e4a9dc4c5b06f37274f WatchSource:0}: Error finding container 93d467c754526c215699381e838dd8b81844b027b75a3e4a9dc4c5b06f37274f: Status 404 returned error can't find the container with id 93d467c754526c215699381e838dd8b81844b027b75a3e4a9dc4c5b06f37274f
Apr 22 14:16:46.255369 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.255330    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g66xm" event={"ID":"d326b6a1-5cbd-47fa-a676-90af9406d2a9","Type":"ContainerStarted","Data":"93d467c754526c215699381e838dd8b81844b027b75a3e4a9dc4c5b06f37274f"}
Apr 22 14:16:46.257752 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.257481    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h8bkz" event={"ID":"522ef3ef-a141-4082-8f5a-55a059c52133","Type":"ContainerStarted","Data":"ec93f78da7b6f638d373ba5f9377ca86ec7f0cf2b8646a622f8cdec2ae78dbc4"}
Apr 22 14:16:46.260167 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.260140    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" event={"ID":"edd11ecf-24da-4d1e-83da-e059f384b9ac","Type":"ContainerStarted","Data":"0bd73c6b5daa9c45bcbed63be5cbcf870657c18d4d38d7a82b0c1efa6a02c2e0"}
Apr 22 14:16:46.260273 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.260174    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" event={"ID":"edd11ecf-24da-4d1e-83da-e059f384b9ac","Type":"ContainerStarted","Data":"cf868588e66df3e712714ad65afd925322919a52fee63d24ec673fd7f1b7cce5"}
Apr 22 14:16:46.263045 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.262686    2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:16:46.263045 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.262771    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" event={"ID":"749b33a6-3425-465e-a4b7-7844646cda79","Type":"ContainerStarted","Data":"22cc7fa32bbe43a7bdb43a4163d061e0169e95f84bc787304d2c0b167eb2108f"}
Apr 22 14:16:46.263259 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.263139    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:46.264996 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.264973    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tvtw" event={"ID":"63a7ba57-56d4-4bc2-a700-601433fd8838","Type":"ContainerStarted","Data":"3f5956126528d6779b7cc08ffb2a1a02d90128ffe68e90a09c5e9b71e5a2fb55"}
Apr 22 14:16:46.267009 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.266962    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzzs6" event={"ID":"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1","Type":"ContainerStarted","Data":"08df9b6bddca0f9e19c586f24ed5cdff2279f6908a509a49d4b070580edd7134"}
Apr 22 14:16:46.267009 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.266989    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzzs6" event={"ID":"fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1","Type":"ContainerStarted","Data":"c415072329b1675f0a8aa6c720e0925b5ba34c70b1ddad59277d508a72c58e3c"}
Apr 22 14:16:46.267438 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.267418    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vzzs6"
Apr 22 14:16:46.276199 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.276145    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h8bkz" podStartSLOduration=33.848525506 podStartE2EDuration="37.276128966s" podCreationTimestamp="2026-04-22 14:16:09 +0000 UTC" firstStartedPulling="2026-04-22 14:16:42.119210577 +0000 UTC m=+63.901587690" lastFinishedPulling="2026-04-22 14:16:45.546814035 +0000 UTC m=+67.329191150" observedRunningTime="2026-04-22 14:16:46.273831051 +0000 UTC m=+68.056208185" watchObservedRunningTime="2026-04-22 14:16:46.276128966 +0000 UTC m=+68.058506104"
Apr 22 14:16:46.310919 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.310747    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" podStartSLOduration=51.687102006 podStartE2EDuration="1m4.310728736s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:11.847108503 +0000 UTC m=+33.629485623" lastFinishedPulling="2026-04-22 14:16:24.470735237 +0000 UTC m=+46.253112353" observedRunningTime="2026-04-22 14:16:46.290328681 +0000 UTC m=+68.072705821" watchObservedRunningTime="2026-04-22 14:16:46.310728736 +0000 UTC m=+68.093105871"
Apr 22 14:16:46.311083 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.310968    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vzzs6" podStartSLOduration=33.967878703 podStartE2EDuration="37.310960567s" podCreationTimestamp="2026-04-22 14:16:09 +0000 UTC" firstStartedPulling="2026-04-22 14:16:42.10645287 +0000 UTC m=+63.888829987" lastFinishedPulling="2026-04-22 14:16:45.449534721 +0000 UTC m=+67.231911851" observedRunningTime="2026-04-22 14:16:46.310148562 +0000 UTC m=+68.092525695" watchObservedRunningTime="2026-04-22 14:16:46.310960567 +0000 UTC m=+68.093337702"
Apr 22 14:16:46.327478 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:46.327225    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rfgzx" podStartSLOduration=60.893619515 podStartE2EDuration="1m4.327207888s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:16:42.113408254 +0000 UTC m=+63.895785373" lastFinishedPulling="2026-04-22 14:16:45.546996622 +0000 UTC m=+67.329373746" observedRunningTime="2026-04-22 14:16:46.326739096 +0000 UTC m=+68.109116243" watchObservedRunningTime="2026-04-22 14:16:46.327207888 +0000 UTC m=+68.109585023"
Apr 22 14:16:47.263839 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.263789    2542 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-ctlhj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.7:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Apr 22 14:16:47.264437 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.263863    2542 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj" podUID="749b33a6-3425-465e-a4b7-7844646cda79" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.7:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 22 14:16:47.273651 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.273598    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tvtw" event={"ID":"63a7ba57-56d4-4bc2-a700-601433fd8838","Type":"ContainerStarted","Data":"d21fc5cc8443c2465cb0cd8f66aa9afeb63022f50e461e3ed271187df8b0892a"}
Apr 22 14:16:47.276064 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.275339    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g66xm" event={"ID":"d326b6a1-5cbd-47fa-a676-90af9406d2a9","Type":"ContainerStarted","Data":"097785093f57c4948ff12b2aaa48ac8fff7aeb439459b1feb2b1c7ae720283c2"}
Apr 22 14:16:47.293282 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.293243    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5tvtw" podStartSLOduration=16.492921559 podStartE2EDuration="20.293227303s" podCreationTimestamp="2026-04-22 14:16:27 +0000 UTC" firstStartedPulling="2026-04-22 14:16:43.287919539 +0000 UTC m=+65.070296656" lastFinishedPulling="2026-04-22 14:16:47.088225288 +0000 UTC m=+68.870602400" observedRunningTime="2026-04-22 14:16:47.292096241 +0000 UTC m=+69.074473375" watchObservedRunningTime="2026-04-22 14:16:47.293227303 +0000 UTC m=+69.075604434"
Apr 22 14:16:47.462240 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:47.462210    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctlhj"
Apr 22 14:16:48.279921 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:48.279806    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g66xm" event={"ID":"d326b6a1-5cbd-47fa-a676-90af9406d2a9","Type":"ContainerStarted","Data":"15e482952bd27341abad58982da29a788119b2bcd082fa395fd93a06ae80f5c2"}
Apr 22 14:16:48.300754 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:48.300702    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g66xm" podStartSLOduration=67.840471248 podStartE2EDuration="1m9.30068629s" podCreationTimestamp="2026-04-22 14:15:39 +0000 UTC" firstStartedPulling="2026-04-22 14:16:45.625508768 +0000 UTC m=+67.407885879" lastFinishedPulling="2026-04-22 14:16:47.085723806 +0000 UTC m=+68.868100921" observedRunningTime="2026-04-22 14:16:48.298708491 +0000 UTC m=+70.081085622" watchObservedRunningTime="2026-04-22 14:16:48.30068629 +0000 UTC m=+70.083063424"
Apr 22 14:16:50.014103 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:50.014065    2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"
Apr 22 14:16:50.016458 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:50.016438    2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/195e9aad-8ddd-4d14-84ae-1158d9c78159-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fnfhk\" (UID: \"195e9aad-8ddd-4d14-84ae-1158d9c78159\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"
Apr 22 14:16:50.192937 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:50.192906    2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-d9s8x\""
Apr 22 14:16:50.200841 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:50.200811    2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"
Apr 22 14:16:50.321081 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:50.320967    2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk"]
Apr 22 14:16:50.323910 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:16:50.323885    2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod195e9aad_8ddd_4d14_84ae_1158d9c78159.slice/crio-13cfa5f2447ddb82c086bde5e28ddf7807641dcc62cccf0de714971b5bda88de WatchSource:0}: Error finding container 13cfa5f2447ddb82c086bde5e28ddf7807641dcc62cccf0de714971b5bda88de: Status 404 returned error can't find the container with id 13cfa5f2447ddb82c086bde5e28ddf7807641dcc62cccf0de714971b5bda88de
Apr 22 14:16:51.291171 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:51.291132    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" event={"ID":"195e9aad-8ddd-4d14-84ae-1158d9c78159","Type":"ContainerStarted","Data":"13cfa5f2447ddb82c086bde5e28ddf7807641dcc62cccf0de714971b5bda88de"}
Apr 22 14:16:51.739388 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:51.739358    2542 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"]
Apr 22 14:16:52.295448 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:52.295415    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" event={"ID":"195e9aad-8ddd-4d14-84ae-1158d9c78159","Type":"ContainerStarted","Data":"346f7b57e2e12efb468fc585fff20483a34e2648de9238e97d9b9d5a6d586752"}
Apr 22 14:16:52.314804 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:52.314750    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fnfhk" podStartSLOduration=32.9544524 podStartE2EDuration="34.314735124s" podCreationTimestamp="2026-04-22 14:16:18 +0000 UTC" firstStartedPulling="2026-04-22 14:16:50.326317353 +0000 UTC m=+72.108694465" lastFinishedPulling="2026-04-22 14:16:51.686600063 +0000 UTC m=+73.468977189" observedRunningTime="2026-04-22 14:16:52.313332885 +0000 UTC m=+74.095710019" watchObservedRunningTime="2026-04-22 14:16:52.314735124 +0000 UTC m=+74.097112257"
Apr 22 14:16:56.278775 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:56.278738    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vzzs6"
Apr 22 14:16:57.189207 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:16:57.189151    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b27s2"
Apr 22 14:17:01.745341 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:01.745307    2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:17:13.709992 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:13.709948    2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:17:13.712493 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:13.712467    2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b99ce1d1-2784-4681-b12f-a7fc95fa5fa1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nqmkm\" (UID: \"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:17:13.975146 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:13.975059    2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jnrv9\""
Apr 22 14:17:13.983237 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:13.983207    2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"
Apr 22 14:17:14.108747 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:14.108716    2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm"]
Apr 22 14:17:14.111726 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:17:14.111658    2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99ce1d1_2784_4681_b12f_a7fc95fa5fa1.slice/crio-00e139437dc64ceecd55b6d569203f3666cb6323170ba8108dc5ce074d6c5b79 WatchSource:0}: Error finding container 00e139437dc64ceecd55b6d569203f3666cb6323170ba8108dc5ce074d6c5b79: Status 404 returned error can't find the container with id 00e139437dc64ceecd55b6d569203f3666cb6323170ba8108dc5ce074d6c5b79
Apr 22 14:17:14.363447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:14.363360    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" event={"ID":"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1","Type":"ContainerStarted","Data":"00e139437dc64ceecd55b6d569203f3666cb6323170ba8108dc5ce074d6c5b79"}
Apr 22 14:17:16.371067 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:16.370721    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" event={"ID":"b99ce1d1-2784-4681-b12f-a7fc95fa5fa1","Type":"ContainerStarted","Data":"1fe4b5ed231cb8bd3806c324c02c8c6171d92d60cfdfc27e9f909d63af712c1b"}
Apr 22 14:17:16.390657 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:16.390601    2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nqmkm" podStartSLOduration=92.685579195 podStartE2EDuration="1m34.390582511s" podCreationTimestamp="2026-04-22 14:15:42 +0000 UTC" firstStartedPulling="2026-04-22 14:17:14.113823626 +0000 UTC m=+95.896200737" lastFinishedPulling="2026-04-22 14:17:15.818826928 +0000 UTC m=+97.601204053" observedRunningTime="2026-04-22 14:17:16.389518488 +0000 UTC m=+98.171895623" watchObservedRunningTime="2026-04-22 14:17:16.390582511 +0000 UTC m=+98.172959645"
Apr 22 14:17:16.765866 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:16.765814    2542 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7fd97b775f-28299" podUID="e092464a-c357-4406-b079-4c8f0ece38e9" containerName="registry" containerID="cri-o://4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7" gracePeriod=30
Apr 22 14:17:17.003762 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.003736    2542 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:17:17.030794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030720    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.030794 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030770    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030857    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqxv5\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030905    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030940    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.030970    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031031 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.031000    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031287 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.031039    2542 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca\") pod \"e092464a-c357-4406-b079-4c8f0ece38e9\" (UID: \"e092464a-c357-4406-b079-4c8f0ece38e9\") "
Apr 22 14:17:17.031621 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.031510    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:17.031621 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.031602    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:17.033899 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.033861    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:17.034098 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.034056    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:17:17.034425 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.034242    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:17:17.034425 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.034411    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:17.034545 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.034464    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5" (OuterVolumeSpecName: "kube-api-access-bqxv5") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "kube-api-access-bqxv5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:17:17.042711 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.042678    2542 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e092464a-c357-4406-b079-4c8f0ece38e9" (UID: "e092464a-c357-4406-b079-4c8f0ece38e9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:17:17.131776 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131743    2542 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-image-registry-private-configuration\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131776 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131770    2542 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqxv5\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-kube-api-access-bqxv5\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131776 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131781    2542 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e092464a-c357-4406-b079-4c8f0ece38e9-installation-pull-secrets\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131793    2542 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e092464a-c357-4406-b079-4c8f0ece38e9-ca-trust-extracted\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131804    2542 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-bound-sa-token\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131812    2542 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e092464a-c357-4406-b079-4c8f0ece38e9-registry-tls\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131820    2542 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-trusted-ca\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.131994 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.131827    2542 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e092464a-c357-4406-b079-4c8f0ece38e9-registry-certificates\") on node \"ip-10-0-133-65.ec2.internal\" DevicePath \"\""
Apr 22 14:17:17.374595 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.374565    2542 generic.go:358] "Generic (PLEG): container finished" podID="e092464a-c357-4406-b079-4c8f0ece38e9" containerID="4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7" exitCode=0
Apr 22 14:17:17.375080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.374630    2542 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fd97b775f-28299"
Apr 22 14:17:17.375080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.374646    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd97b775f-28299" event={"ID":"e092464a-c357-4406-b079-4c8f0ece38e9","Type":"ContainerDied","Data":"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"}
Apr 22 14:17:17.375080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.374683    2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd97b775f-28299" event={"ID":"e092464a-c357-4406-b079-4c8f0ece38e9","Type":"ContainerDied","Data":"88de2026adb99848272a63b94383f243171970e38d8e374bc805570f0169ef74"}
Apr 22 14:17:17.375080 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.374702    2542 scope.go:117] "RemoveContainer" containerID="4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"
Apr 22 14:17:17.383167 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.383146    2542 scope.go:117] "RemoveContainer" containerID="4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"
Apr 22 14:17:17.383443 ip-10-0-133-65 kubenswrapper[2542]: E0422 14:17:17.383422    2542 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7\": container with ID starting with 4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7 not found: ID does not exist" containerID="4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"
Apr 22 14:17:17.383507 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.383450    2542 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7"} err="failed to get container status
\"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7\": rpc error: code = NotFound desc = could not find container \"4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7\": container with ID starting with 4fd5f9dc2b6bf362e402ee4d37ab64038979bd26036da714394f5661ee0d08e7 not found: ID does not exist" Apr 22 14:17:17.399135 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.399108 2542 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"] Apr 22 14:17:17.408776 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:17.408751 2542 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7fd97b775f-28299"] Apr 22 14:17:18.788423 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:18.788389 2542 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e092464a-c357-4406-b079-4c8f0ece38e9" path="/var/lib/kubelet/pods/e092464a-c357-4406-b079-4c8f0ece38e9/volumes" Apr 22 14:17:25.070251 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.070209 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tqbsm"] Apr 22 14:17:25.070728 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.070513 2542 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e092464a-c357-4406-b079-4c8f0ece38e9" containerName="registry" Apr 22 14:17:25.070728 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.070524 2542 state_mem.go:107] "Deleted CPUSet assignment" podUID="e092464a-c357-4406-b079-4c8f0ece38e9" containerName="registry" Apr 22 14:17:25.070728 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.070579 2542 memory_manager.go:356] "RemoveStaleState removing state" podUID="e092464a-c357-4406-b079-4c8f0ece38e9" containerName="registry" Apr 22 14:17:25.075306 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.075289 2542 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.078518 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.078491 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:17:25.079504 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079469 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-tls\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079610 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079529 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079610 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079572 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079610 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079604 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg5k\" (UniqueName: \"kubernetes.io/projected/b8946a4b-2607-45a5-847d-76318056ce20-kube-api-access-zlg5k\") pod \"node-exporter-tqbsm\" (UID: 
\"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079754 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079632 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-wtmp\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079754 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079671 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-textfile\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079754 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079719 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-metrics-client-ca\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079893 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079758 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-root\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.079893 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.079793 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-sys\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.080035 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.080016 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:17:25.080120 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.080024 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s69rl\"" Apr 22 14:17:25.080339 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.080320 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:17:25.080595 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.080578 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:17:25.180830 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180789 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-textfile\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180845 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-metrics-client-ca\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180868 2542 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-root\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180884 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-sys\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180923 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-tls\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181004 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180984 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-root\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.180991 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-sys\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181018 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181071 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181098 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg5k\" (UniqueName: \"kubernetes.io/projected/b8946a4b-2607-45a5-847d-76318056ce20-kube-api-access-zlg5k\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181125 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-wtmp\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181169 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-textfile\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181351 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181302 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-wtmp\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181647 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181525 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-metrics-client-ca\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.181682 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.181649 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.183363 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.183342 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.183471 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.183368 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8946a4b-2607-45a5-847d-76318056ce20-node-exporter-tls\") pod \"node-exporter-tqbsm\" (UID: 
\"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.195271 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.195245 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg5k\" (UniqueName: \"kubernetes.io/projected/b8946a4b-2607-45a5-847d-76318056ce20-kube-api-access-zlg5k\") pod \"node-exporter-tqbsm\" (UID: \"b8946a4b-2607-45a5-847d-76318056ce20\") " pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.387631 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:25.387599 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tqbsm" Apr 22 14:17:25.397949 ip-10-0-133-65 kubenswrapper[2542]: W0422 14:17:25.397922 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8946a4b_2607_45a5_847d_76318056ce20.slice/crio-6227a0b1c5a45db49b1020659d663ca12e499b2fd5e156d27beb6a73a0a6d579 WatchSource:0}: Error finding container 6227a0b1c5a45db49b1020659d663ca12e499b2fd5e156d27beb6a73a0a6d579: Status 404 returned error can't find the container with id 6227a0b1c5a45db49b1020659d663ca12e499b2fd5e156d27beb6a73a0a6d579 Apr 22 14:17:26.401294 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:26.401263 2542 generic.go:358] "Generic (PLEG): container finished" podID="b8946a4b-2607-45a5-847d-76318056ce20" containerID="5d5e9dd8e15f31674f1ab6374573307a175e8f1a35ee8001c2678d2361c15796" exitCode=0 Apr 22 14:17:26.401692 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:26.401345 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbsm" event={"ID":"b8946a4b-2607-45a5-847d-76318056ce20","Type":"ContainerDied","Data":"5d5e9dd8e15f31674f1ab6374573307a175e8f1a35ee8001c2678d2361c15796"} Apr 22 14:17:26.401692 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:26.401395 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-tqbsm" event={"ID":"b8946a4b-2607-45a5-847d-76318056ce20","Type":"ContainerStarted","Data":"6227a0b1c5a45db49b1020659d663ca12e499b2fd5e156d27beb6a73a0a6d579"} Apr 22 14:17:27.406047 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:27.406014 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbsm" event={"ID":"b8946a4b-2607-45a5-847d-76318056ce20","Type":"ContainerStarted","Data":"a9946b9ced7bf6a851c5197114fd2df010037edbb0a7378a7fbc3363ca507fe7"} Apr 22 14:17:27.406465 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:27.406052 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbsm" event={"ID":"b8946a4b-2607-45a5-847d-76318056ce20","Type":"ContainerStarted","Data":"9c65e56e92a4c6f2e324acd38203a932a88d1dc30980994faef0f307ec179c3b"} Apr 22 14:17:27.438526 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:27.438469 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tqbsm" podStartSLOduration=1.643714615 podStartE2EDuration="2.438456566s" podCreationTimestamp="2026-04-22 14:17:25 +0000 UTC" firstStartedPulling="2026-04-22 14:17:25.39942431 +0000 UTC m=+107.181801422" lastFinishedPulling="2026-04-22 14:17:26.194166258 +0000 UTC m=+107.976543373" observedRunningTime="2026-04-22 14:17:27.436763342 +0000 UTC m=+109.219140476" watchObservedRunningTime="2026-04-22 14:17:27.438456566 +0000 UTC m=+109.220833699" Apr 22 14:17:45.456293 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:45.456262 2542 generic.go:358] "Generic (PLEG): container finished" podID="3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9" containerID="c01b2382e7816a2496cb1be4f2c2749df065f30d6ff5fcb2a04392db237a7031" exitCode=0 Apr 22 14:17:45.456604 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:45.456338 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" event={"ID":"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9","Type":"ContainerDied","Data":"c01b2382e7816a2496cb1be4f2c2749df065f30d6ff5fcb2a04392db237a7031"} Apr 22 14:17:45.456658 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:45.456644 2542 scope.go:117] "RemoveContainer" containerID="c01b2382e7816a2496cb1be4f2c2749df065f30d6ff5fcb2a04392db237a7031" Apr 22 14:17:46.460818 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:46.460786 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7rsj7" event={"ID":"3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9","Type":"ContainerStarted","Data":"a2d8643d353ba5e5326faaaf812699688da0e0920f33b8cd9060138a26e6907d"} Apr 22 14:17:50.258984 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:50.258939 2542 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" podUID="8de6ebfa-8fdc-401b-b660-aaaf38fda26e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:17:52.477703 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:52.477661 2542 generic.go:358] "Generic (PLEG): container finished" podID="43dd5ba1-95ef-4c1b-b853-cdc8032e3625" containerID="90dcca55c5a4c37debf94a75e02e40ab5ae42e54f3bd01ea65cbb2a0dff5f20b" exitCode=0 Apr 22 14:17:52.478075 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:52.477732 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" event={"ID":"43dd5ba1-95ef-4c1b-b853-cdc8032e3625","Type":"ContainerDied","Data":"90dcca55c5a4c37debf94a75e02e40ab5ae42e54f3bd01ea65cbb2a0dff5f20b"} Apr 22 14:17:52.478075 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:52.478058 2542 scope.go:117] "RemoveContainer" 
containerID="90dcca55c5a4c37debf94a75e02e40ab5ae42e54f3bd01ea65cbb2a0dff5f20b" Apr 22 14:17:53.482525 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:53.482490 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pbfkk" event={"ID":"43dd5ba1-95ef-4c1b-b853-cdc8032e3625","Type":"ContainerStarted","Data":"3e9f111be9be621621ccb86e4cc214c4567979d73b47688d1897d790902594ad"} Apr 22 14:17:56.491974 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:56.491934 2542 generic.go:358] "Generic (PLEG): container finished" podID="ff59a9f8-3474-46f0-9922-f6372bd8119f" containerID="40fa5088a75ac77b4b15c3a227045fc6256c44fdd765dab124ecf9d85577ef37" exitCode=0 Apr 22 14:17:56.492447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:56.492008 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-czjzj" event={"ID":"ff59a9f8-3474-46f0-9922-f6372bd8119f","Type":"ContainerDied","Data":"40fa5088a75ac77b4b15c3a227045fc6256c44fdd765dab124ecf9d85577ef37"} Apr 22 14:17:56.492447 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:56.492429 2542 scope.go:117] "RemoveContainer" containerID="40fa5088a75ac77b4b15c3a227045fc6256c44fdd765dab124ecf9d85577ef37" Apr 22 14:17:57.495936 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:17:57.495904 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-czjzj" event={"ID":"ff59a9f8-3474-46f0-9922-f6372bd8119f","Type":"ContainerStarted","Data":"1e785346a09ebfec692cfce0b2c635d7e4fa0e7d77dd5e18351664dfdef2bad7"} Apr 22 14:18:00.258576 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:00.258530 2542 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" podUID="8de6ebfa-8fdc-401b-b660-aaaf38fda26e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:18:10.259023 
ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.258981 2542 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" podUID="8de6ebfa-8fdc-401b-b660-aaaf38fda26e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:18:10.259502 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.259063 2542 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" Apr 22 14:18:10.259611 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.259594 2542 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"f085aace21bc81bfb92af19f451ef9fc15a714819b14566979d8ead4f742ae63"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 14:18:10.259649 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.259631 2542 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" podUID="8de6ebfa-8fdc-401b-b660-aaaf38fda26e" containerName="service-proxy" containerID="cri-o://f085aace21bc81bfb92af19f451ef9fc15a714819b14566979d8ead4f742ae63" gracePeriod=30 Apr 22 14:18:10.541547 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.540409 2542 generic.go:358] "Generic (PLEG): container finished" podID="8de6ebfa-8fdc-401b-b660-aaaf38fda26e" containerID="f085aace21bc81bfb92af19f451ef9fc15a714819b14566979d8ead4f742ae63" exitCode=2 Apr 22 14:18:10.541547 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.540485 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" 
event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerDied","Data":"f085aace21bc81bfb92af19f451ef9fc15a714819b14566979d8ead4f742ae63"} Apr 22 14:18:10.541547 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:18:10.540514 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8c9c7b7f7-gdzsv" event={"ID":"8de6ebfa-8fdc-401b-b660-aaaf38fda26e","Type":"ContainerStarted","Data":"07c8df3481f4edd166cc4f5e2ca9d5c50ef1abf0f5603e5d636de6e1f20a72ad"} Apr 22 14:20:38.699799 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:20:38.699769 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:20:38.700358 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:20:38.699771 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:20:38.707052 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:20:38.707032 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:20:38.707226 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:20:38.707205 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:20:38.709870 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:20:38.709849 2542 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:25:38.720642 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:25:38.720611 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:25:38.721842 ip-10-0-133-65 
kubenswrapper[2542]: I0422 14:25:38.721818 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:25:38.726874 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:25:38.726855 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:25:38.727796 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:25:38.727773 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:30:38.739722 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:30:38.739692 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:30:38.745856 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:30:38.745831 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:30:38.748981 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:30:38.748951 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log" Apr 22 14:30:38.754769 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:30:38.754750 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log" Apr 22 14:35:38.762061 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:35:38.762028 2542 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:35:38.768846 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:35:38.768823 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:35:38.769309 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:35:38.769288 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:35:38.775482 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:35:38.775461 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:40:38.781812 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:40:38.781779 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:40:38.794109 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:40:38.794088 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:40:38.794493 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:40:38.794473 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:40:38.799882 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:40:38.799864 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:45:38.806995 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:45:38.806879 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:45:38.813122 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:45:38.813104 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:45:38.813289 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:45:38.813273 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:45:38.818765 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:45:38.818748 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:50:38.825954 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:50:38.825853 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:50:38.831805 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:50:38.831784 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:50:38.831942 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:50:38.831785 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:50:38.837623 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:50:38.837604 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:55:38.843868 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:55:38.843761 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:55:38.849935 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:55:38.849913 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 14:55:38.850657 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:55:38.850636 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 14:55:38.856356 ip-10-0-133-65 kubenswrapper[2542]: I0422 14:55:38.856339 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 15:00:38.862949 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:00:38.862840 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 15:00:38.868852 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:00:38.868828 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 15:00:38.869207 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:00:38.869169 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 15:00:38.874852 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:00:38.874834 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 15:04:58.129245 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.129138 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zbnb6/must-gather-sr56l"]
Apr 22 15:04:58.132365 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.132347 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.135460 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.135438 2542 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zbnb6\"/\"default-dockercfg-g9xjx\""
Apr 22 15:04:58.135598 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.135495 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"kube-root-ca.crt\""
Apr 22 15:04:58.135598 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.135549 2542 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"openshift-service-ca.crt\""
Apr 22 15:04:58.143709 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.143683 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/must-gather-sr56l"]
Apr 22 15:04:58.211214 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.211143 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739bb006-1820-497f-b1b8-02fb5c1ff036-must-gather-output\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.211388 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.211231 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldcwd\" (UniqueName: \"kubernetes.io/projected/739bb006-1820-497f-b1b8-02fb5c1ff036-kube-api-access-ldcwd\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.311691 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.311650 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739bb006-1820-497f-b1b8-02fb5c1ff036-must-gather-output\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.311864 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.311700 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldcwd\" (UniqueName: \"kubernetes.io/projected/739bb006-1820-497f-b1b8-02fb5c1ff036-kube-api-access-ldcwd\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.312012 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.311991 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/739bb006-1820-497f-b1b8-02fb5c1ff036-must-gather-output\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.323321 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.323287 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldcwd\" (UniqueName: \"kubernetes.io/projected/739bb006-1820-497f-b1b8-02fb5c1ff036-kube-api-access-ldcwd\") pod \"must-gather-sr56l\" (UID: \"739bb006-1820-497f-b1b8-02fb5c1ff036\") " pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.441505 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.441409 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/must-gather-sr56l"
Apr 22 15:04:58.578301 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.578268 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/must-gather-sr56l"]
Apr 22 15:04:58.581334 ip-10-0-133-65 kubenswrapper[2542]: W0422 15:04:58.581304 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739bb006_1820_497f_b1b8_02fb5c1ff036.slice/crio-1abbf4b983a3d95a6465fddb812beacf5712b9c68d1c84d918f5ed22aa9eeae2 WatchSource:0}: Error finding container 1abbf4b983a3d95a6465fddb812beacf5712b9c68d1c84d918f5ed22aa9eeae2: Status 404 returned error can't find the container with id 1abbf4b983a3d95a6465fddb812beacf5712b9c68d1c84d918f5ed22aa9eeae2
Apr 22 15:04:58.583004 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:58.582985 2542 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:04:59.296851 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:04:59.296795 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/must-gather-sr56l" event={"ID":"739bb006-1820-497f-b1b8-02fb5c1ff036","Type":"ContainerStarted","Data":"1abbf4b983a3d95a6465fddb812beacf5712b9c68d1c84d918f5ed22aa9eeae2"}
Apr 22 15:05:00.302118 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:00.302071 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/must-gather-sr56l" event={"ID":"739bb006-1820-497f-b1b8-02fb5c1ff036","Type":"ContainerStarted","Data":"4fefe92efcf9c30165ae3929782cac7183e37b7e9e2c0bdbf8e8a31dbb3844bd"}
Apr 22 15:05:00.302118 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:00.302124 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/must-gather-sr56l" event={"ID":"739bb006-1820-497f-b1b8-02fb5c1ff036","Type":"ContainerStarted","Data":"025f4366e031b07a4f819b6b3ef5abe72b2a5cb613e0b4f3fd70b4a669f606e0"}
Apr 22 15:05:00.329831 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:00.329771 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zbnb6/must-gather-sr56l" podStartSLOduration=1.488480024 podStartE2EDuration="2.329751014s" podCreationTimestamp="2026-04-22 15:04:58 +0000 UTC" firstStartedPulling="2026-04-22 15:04:58.583108852 +0000 UTC m=+2960.365485964" lastFinishedPulling="2026-04-22 15:04:59.424379828 +0000 UTC m=+2961.206756954" observedRunningTime="2026-04-22 15:05:00.328134643 +0000 UTC m=+2962.110511777" watchObservedRunningTime="2026-04-22 15:05:00.329751014 +0000 UTC m=+2962.112128149"
Apr 22 15:05:01.469093 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:01.469055 2542 ???:1] "http: TLS handshake error from 10.0.130.255:55810: EOF"
Apr 22 15:05:01.476982 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:01.476951 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s96sk_00b1aeaa-9a7a-4380-a5aa-0891caae4c5e/global-pull-secret-syncer/0.log"
Apr 22 15:05:01.650821 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:01.650783 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h4hlm_2096774e-6172-439a-a5c0-779a91d43a80/konnectivity-agent/0.log"
Apr 22 15:05:01.855685 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:01.855609 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-65.ec2.internal_0c43fa24b7e06ba8965ac528ac76a464/haproxy/0.log"
Apr 22 15:05:05.045873 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:05.045838 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-nqmkm_b99ce1d1-2784-4681-b12f-a7fc95fa5fa1/cluster-monitoring-operator/0.log"
Apr 22 15:05:05.435466 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:05.435425 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbsm_b8946a4b-2607-45a5-847d-76318056ce20/node-exporter/0.log"
Apr 22 15:05:05.459232 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:05.459166 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbsm_b8946a4b-2607-45a5-847d-76318056ce20/kube-rbac-proxy/0.log"
Apr 22 15:05:05.485608 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:05.485569 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbsm_b8946a4b-2607-45a5-847d-76318056ce20/init-textfile/0.log"
Apr 22 15:05:07.281001 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:07.280961 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fnfhk_195e9aad-8ddd-4d14-84ae-1158d9c78159/networking-console-plugin/0.log"
Apr 22 15:05:07.734809 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:07.734781 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/1.log"
Apr 22 15:05:07.739512 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:07.739478 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctlhj_749b33a6-3425-465e-a4b7-7844646cda79/console-operator/2.log"
Apr 22 15:05:08.132550 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.132511 2542 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"]
Apr 22 15:05:08.135333 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.135309 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.147702 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.147672 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"]
Apr 22 15:05:08.201963 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.201924 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5hh\" (UniqueName: \"kubernetes.io/projected/3da72ea5-5e21-45ba-9ada-f365e0bbff60-kube-api-access-fh5hh\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.202149 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.202006 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-podres\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.202149 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.202056 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-sys\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.202149 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.202085 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-lib-modules\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.202149 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.202122 2542 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-proc\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303489 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303453 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5hh\" (UniqueName: \"kubernetes.io/projected/3da72ea5-5e21-45ba-9ada-f365e0bbff60-kube-api-access-fh5hh\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303513 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-podres\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303544 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-sys\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303564 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-lib-modules\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303592 2542 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-proc\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303651 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-podres\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303674 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-sys\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303662 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-proc\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.303908 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.303731 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3da72ea5-5e21-45ba-9ada-f365e0bbff60-lib-modules\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.313392 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.313368 2542 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5hh\" (UniqueName: \"kubernetes.io/projected/3da72ea5-5e21-45ba-9ada-f365e0bbff60-kube-api-access-fh5hh\") pod \"perf-node-gather-daemonset-j5gst\" (UID: \"3da72ea5-5e21-45ba-9ada-f365e0bbff60\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.445575 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.445497 2542 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:08.587132 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.587110 2542 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"]
Apr 22 15:05:08.588659 ip-10-0-133-65 kubenswrapper[2542]: W0422 15:05:08.588621 2542 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3da72ea5_5e21_45ba_9ada_f365e0bbff60.slice/crio-1fe37c3804e3be53be406ff9517eeebae508957c544e62f58e1ceab383eb1702 WatchSource:0}: Error finding container 1fe37c3804e3be53be406ff9517eeebae508957c544e62f58e1ceab383eb1702: Status 404 returned error can't find the container with id 1fe37c3804e3be53be406ff9517eeebae508957c544e62f58e1ceab383eb1702
Apr 22 15:05:08.681964 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:08.681936 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-6chj8_bd7bfce3-b27e-45be-bb80-1b123b5b0fef/volume-data-source-validator/0.log"
Apr 22 15:05:09.351582 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.351541 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst" event={"ID":"3da72ea5-5e21-45ba-9ada-f365e0bbff60","Type":"ContainerStarted","Data":"a0747c0bae82817aa4c9b0464ee1e032ff200832d60bc28c02be676a3f3b8cea"}
Apr 22 15:05:09.352078 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.351589 2542 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst" event={"ID":"3da72ea5-5e21-45ba-9ada-f365e0bbff60","Type":"ContainerStarted","Data":"1fe37c3804e3be53be406ff9517eeebae508957c544e62f58e1ceab383eb1702"}
Apr 22 15:05:09.352078 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.351697 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:09.371982 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.371928 2542 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst" podStartSLOduration=1.371911364 podStartE2EDuration="1.371911364s" podCreationTimestamp="2026-04-22 15:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:05:09.369902153 +0000 UTC m=+2971.152279300" watchObservedRunningTime="2026-04-22 15:05:09.371911364 +0000 UTC m=+2971.154288559"
Apr 22 15:05:09.547239 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.547206 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vzzs6_fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1/dns/0.log"
Apr 22 15:05:09.583208 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.583165 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vzzs6_fa2f50e4-69ed-4a6b-9fe0-3d70269a73d1/kube-rbac-proxy/0.log"
Apr 22 15:05:09.651279 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:09.651246 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k4tqk_6d9a3188-bb23-4be3-b39b-234bee924217/dns-node-resolver/0.log"
Apr 22 15:05:10.310495 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:10.310466 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wkvtd_3dd1c758-7fe6-4a4e-b170-8e5c199c937c/node-ca/0.log"
Apr 22 15:05:11.202437 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:11.202404 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-857878798d-nzjn8_cda3d6aa-e281-4e92-adba-267754d5dd86/router/0.log"
Apr 22 15:05:11.605750 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:11.605630 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h8bkz_522ef3ef-a141-4082-8f5a-55a059c52133/serve-healthcheck-canary/0.log"
Apr 22 15:05:12.020747 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:12.020697 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-czjzj_ff59a9f8-3474-46f0-9922-f6372bd8119f/insights-operator/0.log"
Apr 22 15:05:12.021318 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:12.021296 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-czjzj_ff59a9f8-3474-46f0-9922-f6372bd8119f/insights-operator/1.log"
Apr 22 15:05:12.045579 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:12.045542 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tvtw_63a7ba57-56d4-4bc2-a700-601433fd8838/kube-rbac-proxy/0.log"
Apr 22 15:05:12.071435 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:12.071386 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tvtw_63a7ba57-56d4-4bc2-a700-601433fd8838/exporter/0.log"
Apr 22 15:05:12.097801 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:12.097755 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tvtw_63a7ba57-56d4-4bc2-a700-601433fd8838/extractor/0.log"
Apr 22 15:05:15.364562 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:15.364533 2542 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-j5gst"
Apr 22 15:05:19.977353 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:19.977253 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7rsj7_3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9/kube-storage-version-migrator-operator/1.log"
Apr 22 15:05:19.978162 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:19.978123 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7rsj7_3967e40c-bb30-4b21-bdcf-1e5d9ef87ba9/kube-storage-version-migrator-operator/0.log"
Apr 22 15:05:21.300175 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.300143 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:05:21.332257 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.332230 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/egress-router-binary-copy/0.log"
Apr 22 15:05:21.365962 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.365926 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/cni-plugins/0.log"
Apr 22 15:05:21.397296 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.397269 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/bond-cni-plugin/0.log"
Apr 22 15:05:21.426178 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.426148 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/routeoverride-cni/0.log"
Apr 22 15:05:21.458317 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.458272 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/whereabouts-cni-bincopy/0.log"
Apr 22 15:05:21.487956 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.487927 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7x75g_6f09af95-f295-4dec-8131-f3dad5bd3e4d/whereabouts-cni/0.log"
Apr 22 15:05:21.872768 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:21.872714 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r7sbq_8db7d438-84db-45bb-919c-709bca043fd8/kube-multus/0.log"
Apr 22 15:05:22.069338 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:22.069309 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g66xm_d326b6a1-5cbd-47fa-a676-90af9406d2a9/network-metrics-daemon/0.log"
Apr 22 15:05:22.101039 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:22.101013 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g66xm_d326b6a1-5cbd-47fa-a676-90af9406d2a9/kube-rbac-proxy/0.log"
Apr 22 15:05:23.293116 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.293085 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-controller/0.log"
Apr 22 15:05:23.331337 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.331306 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/0.log"
Apr 22 15:05:23.345583 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.345560 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovn-acl-logging/1.log"
Apr 22 15:05:23.366325 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.366294 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/kube-rbac-proxy-node/0.log"
Apr 22 15:05:23.403477 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.403452 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:05:23.437000 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.436973 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/northd/0.log"
Apr 22 15:05:23.468725 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.468699 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/nbdb/0.log"
Apr 22 15:05:23.495268 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.495245 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/sbdb/0.log"
Apr 22 15:05:23.606720 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:23.606632 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q7q7t_73c63add-22e8-4809-b696-9279d2454538/ovnkube-controller/0.log"
Apr 22 15:05:25.189451 ip-10-0-133-65 kubenswrapper[2542]: I0422 15:05:25.189422 2542 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lqv79_d396d453-0baa-4b43-883b-255470bf1283/check-endpoints/0.log"