Apr 16 18:09:37.406727 ip-10-0-137-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:09:37.875582 ip-10-0-137-102 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:37.875582 ip-10-0-137-102 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:09:37.875582 ip-10-0-137-102 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:37.875582 ip-10-0-137-102 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:09:37.875582 ip-10-0-137-102 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
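The deprecation warnings above all point at the config file named by the kubelet's --config flag. As a minimal sketch (field names are from the upstream KubeletConfiguration v1beta1 API; the values below are illustrative assumptions, not read from this node), the flagged command-line settings could instead be expressed in that file roughly like:

```yaml
# Sketch of a KubeletConfiguration covering the deprecated flags above.
# Values are hypothetical examples, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/var/lib/kubelet/volume-plugins"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: "100Mi"
```

The kubelet would then be started with only --config pointing at this file, which is the migration path the warnings describe.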
Apr 16 18:09:37.879345 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.879250 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:09:37.882664 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882646 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:37.882664 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882663 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882667 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882671 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882674 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882677 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882680 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882683 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882686 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882689 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882692 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882695 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882698 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882700 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882706 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882711 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882714 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882717 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882719 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882722 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:37.882740 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882726 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882730 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882734 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882738 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882740 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882743 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882747 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882750 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882752 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882755 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882758 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882761 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882763 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882766 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882769 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882772 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882775 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882778 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882780 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:37.883195 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882783 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882786 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882788 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882791 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882794 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882796 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882799 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882801 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882804 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882806 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882809 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882812 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882814 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882817 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882819 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882823 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882826 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882829 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882831 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882834 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:37.883672 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882837 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882839 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882842 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882844 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882846 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882849 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882852 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882856 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882859 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882861 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882864 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882866 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882869 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882872 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882874 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882877 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882879 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882882 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882884 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:37.884159 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882886 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882889 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882891 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882894 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882896 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882899 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882901 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.882905 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883337 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883345 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883348 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883351 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883354 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883357 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883360 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883364 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883367 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883370 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883372 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:37.884643 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883375 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883379 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883381 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883384 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883387 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883389 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883392 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883406 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883408 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883411 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883414 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883416 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883419 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883422 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883424 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883427 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883430 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883432 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883435 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:37.885194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883438 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883441 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883444 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883447 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883451 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883454 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883456 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883459 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883461 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883464 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883467 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883469 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883472 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883474 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883477 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883480 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883483 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883485 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883488 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883490 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:37.885696 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883493 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883495 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883498 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883500 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883503 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883505 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883509 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883512 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883515 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883517 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883520 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883523 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883525 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883528 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883532 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883534 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883537 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883540 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883542 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883545 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:37.886194 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883547 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883550 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883554 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883558 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883561 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883563 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883566 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883569 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883572 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883574 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883577 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883579 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883581 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883584 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883587 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.883589 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.884955 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.884967 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885642 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885648 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885653 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:09:37.886701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885656 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885661 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885665 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885669 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885672 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885676 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885680 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885683 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885687 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885690 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885693 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885695 2563 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885698 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885701 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885706 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885709 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885712 2563 flags.go:64] FLAG: --config-dir=""
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885715 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885719 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885723 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885727 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885730 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885733 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885737 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:09:37.887213 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885740 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885742 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885746 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885749 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885753 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885756 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885759 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885762 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885765 2563 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885769 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885774 2563 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885777 2563 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885780 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885784 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885787 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885791 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885794 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885797 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885800 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885803 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885806 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885810 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885813 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885816 2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885819 2563 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:09:37.887808 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885822 2563 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885827 2563 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885830 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:09:37.888486 ip-10-0-137-102
kubenswrapper[2563]: I0416 18:09:37.885833 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885837 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885840 2563 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885844 2563 flags.go:64] FLAG: --help="false" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885847 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-137-102.ec2.internal" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885850 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885853 2563 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885856 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885859 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885863 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885866 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885870 2563 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885873 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885875 2563 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885878 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885882 2563 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885885 2563 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885888 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885891 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885894 2563 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885897 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:09:37.888486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885900 2563 flags.go:64] FLAG: --lock-file="" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885903 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885906 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885909 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885915 2563 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885918 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885921 2563 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: 
I0416 18:09:37.885924 2563 flags.go:64] FLAG: --logging-format="text" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885927 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885930 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885933 2563 flags.go:64] FLAG: --manifest-url="" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885936 2563 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885941 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885944 2563 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885949 2563 flags.go:64] FLAG: --max-pods="110" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885952 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885955 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885958 2563 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885961 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885966 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885970 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885973 2563 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885982 2563 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885985 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885988 2563 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:09:37.889058 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885991 2563 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.885994 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886000 2563 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886002 2563 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886006 2563 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886009 2563 flags.go:64] FLAG: --port="10250" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886012 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886014 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02e8fdd0a46628629" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886018 2563 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886021 2563 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886024 
2563 flags.go:64] FLAG: --register-node="true" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886026 2563 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886029 2563 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886033 2563 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886036 2563 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886047 2563 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886050 2563 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886054 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886057 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886060 2563 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886063 2563 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886067 2563 flags.go:64] FLAG: --runonce="false" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886070 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886073 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886076 2563 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:09:37.889729 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:09:37.886079 2563 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886083 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886087 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886090 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886093 2563 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886097 2563 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886099 2563 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886102 2563 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886105 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886108 2563 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886112 2563 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886115 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886121 2563 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886125 2563 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886128 2563 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886132 2563 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886135 2563 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886138 2563 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886141 2563 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886144 2563 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886147 2563 flags.go:64] FLAG: --v="2" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886151 2563 flags.go:64] FLAG: --version="false" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886155 2563 flags.go:64] FLAG: --vmodule="" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886161 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.886164 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:09:37.890336 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886267 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886271 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886274 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886278 2563 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886281 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886283 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886286 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886289 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886292 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886295 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886298 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886301 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886303 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886306 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886309 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886311 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:37.891004 
ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886314 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886317 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886319 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:37.891004 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886322 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886325 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886327 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886330 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886333 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886336 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886338 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886341 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886343 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886346 2563 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886348 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886351 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886354 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886356 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886359 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886362 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886364 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886367 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886370 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886373 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:37.891476 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886375 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886378 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886380 2563 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886384 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886387 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886389 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886392 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886410 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886413 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886416 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886418 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886421 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886423 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886426 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886429 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 
16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886432 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886434 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886437 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886439 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886442 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:37.891990 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886445 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886447 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886450 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886452 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886455 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886457 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886460 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886464 2563 feature_gate.go:351] Setting GA 
feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886468 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886471 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886473 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886477 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886480 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886482 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886485 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886491 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886494 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886497 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886500 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:37.892532 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886502 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:37.893038 ip-10-0-137-102 
kubenswrapper[2563]: W0416 18:09:37.886506 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886510 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886513 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886516 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886518 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886521 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.886524 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:37.893038 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.887343 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:37.894764 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.894741 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:09:37.894810 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.894765 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 
18:09:37.894842 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894832 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:37.894842 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894838 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:37.894842 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894841 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894845 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894848 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894851 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894853 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894856 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894860 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894862 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894865 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894868 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894871 2563 feature_gate.go:328] unrecognized feature 
gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894875 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894877 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894880 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894883 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894885 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894888 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894891 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894894 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:37.894923 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894896 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894899 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894901 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894904 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894907 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894909 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894912 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894914 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894917 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894920 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894923 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894925 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894928 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894931 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894934 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894937 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894940 2563 feature_gate.go:328] unrecognized feature 
gate: ExternalOIDC Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894943 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894945 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894948 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:37.895387 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894950 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894953 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894955 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894958 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894962 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894968 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894971 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894973 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894976 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894979 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894982 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894985 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894987 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894990 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894993 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894996 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.894999 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895002 2563 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895004 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895007 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:37.895897 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895009 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895012 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895015 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895017 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895020 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895023 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895025 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895028 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895030 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895033 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:37.896448 ip-10-0-137-102 
kubenswrapper[2563]: W0416 18:09:37.895035 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895038 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895041 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895043 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895047 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895051 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895054 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895056 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895059 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:37.896448 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895062 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895065 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895068 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895070 2563 feature_gate.go:328] unrecognized feature 
gate: ClusterVersionOperatorConfiguration Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895073 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895075 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.895080 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895189 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895194 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895197 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895200 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895203 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895206 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895209 2563 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895212 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:37.896908 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895215 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895219 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895223 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895225 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895228 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895230 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895233 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895235 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895239 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895243 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895245 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895248 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895251 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895254 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895256 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895259 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895261 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895264 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:37.897280 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895267 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895270 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895272 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:37.897815 
ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895275 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895277 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895280 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895284 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895287 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895289 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895292 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895295 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895297 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895300 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895302 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895305 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895308 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 
18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895310 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895313 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895316 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895318 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:37.897815 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895321 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895323 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895326 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895329 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895332 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895334 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895337 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895340 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895342 2563 feature_gate.go:328] unrecognized 
feature gate: AzureWorkloadIdentity Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895345 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895348 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895350 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895353 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895356 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895358 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895361 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895363 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895366 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895369 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895371 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:37.898303 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895374 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:37.898814 
ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895377 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895379 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895381 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895384 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895386 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895389 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895392 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895411 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895415 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895420 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895423 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895426 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895429 2563 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895431 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895434 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895437 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895440 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895443 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:37.898814 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:37.895445 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:37.899280 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.895451 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:37.899280 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.896236 2563 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:09:37.901670 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.901655 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:09:37.902722 ip-10-0-137-102 kubenswrapper[2563]: 
I0416 18:09:37.902710 2563 server.go:1019] "Starting client certificate rotation" Apr 16 18:09:37.902829 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.902810 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:37.902893 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.902865 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:37.929949 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.929915 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:37.934445 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.934418 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:37.955702 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.955678 2563 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:09:37.961097 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.961076 2563 log.go:25] "Validated CRI v1 image API" Apr 16 18:09:37.961814 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.961797 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:09:37.963556 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.963535 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:09:37.966847 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.966821 2563 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7bbfb82b-e511-49e4-ae31-b1762a77fc66:/dev/nvme0n1p3 89e9b9db-52ca-49da-ba2b-a635748e273e:/dev/nvme0n1p4] Apr 16 18:09:37.966929 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.966846 2563 fs.go:136] Filesystem partitions: 
map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:09:37.972381 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.972269 2563 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:37.971045257 +0000 UTC m=+0.439654024 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3085158 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2abbaa3b36987dd5feb8cd5cea4c0b SystemUUID:ec2abbaa-3b36-987d-d5fe-b8cd5cea4c0b BootID:8776a50b-c79f-439d-9961-43936cc97f50 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:89:0d:4a:eb:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:89:0d:4a:eb:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:da:7f:32:d9:d9 Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:09:37.972381 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.972376 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 18:09:37.972499 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.972490 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:09:37.974232 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.974206 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:09:37.974368 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.974234 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:09:37.974432 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.974381 2563 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:09:37.974432 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.974390 2563 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:09:37.974432 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.974419 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:09:37.975337 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.975325 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:09:37.976728 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.976717 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:09:37.976842 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.976833 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:09:37.979159 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.979148 2563 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:09:37.979196 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.979162 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:09:37.979196 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.979174 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:09:37.979196 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.979183 2563 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:09:37.979196 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.979191 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:09:37.980291 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.980278 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:09:37.980333 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.980297 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:09:37.983547 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.983531 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:09:37.985281 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.985263 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:09:37.986161 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986147 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986168 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986177 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986185 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986194 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986202 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986223 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986234 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:09:37.986241 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986243 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:09:37.986526 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986252 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:09:37.986526 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986273 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:09:37.986612 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.986550 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:09:37.987544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.987528 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:09:37.987544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.987543 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:09:37.991220 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.991203 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:09:37.991316 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.991246 2563 server.go:1295] "Started kubelet"
Apr 16 18:09:37.991409 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.991356 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:09:37.991451 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.991409 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:09:37.991494 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.991472 2563 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:09:37.992181 ip-10-0-137-102 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:09:37.995550 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.995526 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:09:37.997497 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.997477 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:09:37.997497 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:37.997495 2563 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:09:37.997644 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:37.997475 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:09:37.997644 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:37.997596 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:09:38.002326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.001316 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-102.ec2.internal.18a6e8b787524ecb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-102.ec2.internal,UID:ip-10-0-137-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-102.ec2.internal,},FirstTimestamp:2026-04-16 18:09:37.991216843 +0000 UTC m=+0.459825620,LastTimestamp:2026-04-16 18:09:37.991216843 +0000 UTC m=+0.459825620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-102.ec2.internal,}"
Apr 16 18:09:38.004375 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.004357 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v6dz4"
Apr 16 18:09:38.004756 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.004736 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:09:38.006904 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.006885 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:09:38.007415 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.007389 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:09:38.008210 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008188 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:09:38.008210 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008191 2563 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:09:38.008343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008220 2563 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:09:38.008343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008251 2563 factory.go:55] Registering systemd factory
Apr 16 18:09:38.008343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008299 2563 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:09:38.008343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008337 2563 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:09:38.008525 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008348 2563 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:09:38.008525 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.008366 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.008624 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008602 2563 factory.go:153] Registering CRI-O factory
Apr 16 18:09:38.008624 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008614 2563 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:09:38.008692 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008675 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:09:38.008733 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008697 2563 factory.go:103] Registering Raw factory
Apr 16 18:09:38.008733 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.008712 2563 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:09:38.009195 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.009171 2563 manager.go:319] Starting recovery of all containers
Apr 16 18:09:38.009195 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.009175 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:09:38.009733 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.009703 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v6dz4"
Apr 16 18:09:38.015969 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.015928 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:09:38.020597 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.020464 2563 manager.go:324] Recovery completed
Apr 16 18:09:38.024752 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.024736 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.025498 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.025478 2563 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.027174 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027161 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.027261 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027192 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.027261 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027207 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.027791 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027777 2563 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:09:38.027848 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027791 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:09:38.027848 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.027811 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:09:38.030046 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.030028 2563 policy_none.go:49] "None policy: Start"
Apr 16 18:09:38.030090 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.030054 2563 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:09:38.030090 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.030064 2563 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:09:38.071104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071072 2563 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.071144 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071161 2563 server.go:85] "Starting device plugin registration server"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071439 2563 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071449 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071559 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071652 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.071664 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.073371 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:09:38.083485 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.073428 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.149574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.149493 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:09:38.149574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.149532 2563 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:09:38.149574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.149556 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:09:38.149574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.149565 2563 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:09:38.149811 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.149607 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:09:38.153970 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.153946 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:38.171677 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.171656 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.173647 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.173629 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.173750 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.173661 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.173750 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.173673 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.173750 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.173698 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.183204 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.183182 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.183286 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.183209 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-102.ec2.internal\": node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.200847 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.200820 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.250888 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.250856 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"]
Apr 16 18:09:38.250985 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.250938 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.252477 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.252461 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.252580 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.252487 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.252580 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.252497 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.253651 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.253639 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.253787 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.253772 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.253838 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.253800 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.254443 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254427 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.254522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254459 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.254522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254473 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.254522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254430 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.254522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254516 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.254697 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.254525 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.256421 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.256391 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.256501 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.256439 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:38.257184 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.257166 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:38.257283 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.257197 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:38.257283 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.257210 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:38.278263 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.278240 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.282725 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.282708 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.301701 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.301669 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.310372 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.310347 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51e2afdeb323f70af661db7224d1a09f-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"51e2afdeb323f70af661db7224d1a09f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.310455 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.310376 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.310455 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.310412 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.402016 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.401937 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.411433 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.411498 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411418 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.411498 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411468 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.411498 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411486 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51e2afdeb323f70af661db7224d1a09f-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"51e2afdeb323f70af661db7224d1a09f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.411594 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411532 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51e2afdeb323f70af661db7224d1a09f-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"51e2afdeb323f70af661db7224d1a09f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.411594 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.411548 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99460a2b083cb8262a03a4f6b644b00a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"99460a2b083cb8262a03a4f6b644b00a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.502783 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.502735 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.582310 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.582283 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.585410 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.585379 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 16 18:09:38.603817 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.603791 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.704435 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.704317 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.804845 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.804811 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:38.902366 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.902334 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:09:38.902836 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:38.902545 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:09:38.905489 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:38.905470 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:39.006367 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.006290 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:39.007390 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.007373 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:09:39.011954 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.011913 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:38 +0000 UTC" deadline="2028-01-24 02:48:01.251483374 +0000 UTC"
Apr 16 18:09:39.011954 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.011950 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15536h38m22.239536408s"
Apr 16 18:09:39.020547 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.020529 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:39.046590 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.046563 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-29p2b"
Apr 16 18:09:39.052531 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.052009 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-29p2b"
Apr 16 18:09:39.106889 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.106866 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 16 18:09:39.143560 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:39.143512 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e2afdeb323f70af661db7224d1a09f.slice/crio-1e68bb6b9bef96c3aae8d9fc2206f866e5df7def9b0df691a6f6da7401583bf2 WatchSource:0}: Error finding container 1e68bb6b9bef96c3aae8d9fc2206f866e5df7def9b0df691a6f6da7401583bf2: Status 404 returned error can't find the container with id 1e68bb6b9bef96c3aae8d9fc2206f866e5df7def9b0df691a6f6da7401583bf2
Apr 16 18:09:39.144152 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:39.144134 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99460a2b083cb8262a03a4f6b644b00a.slice/crio-4655e74128b67f00374b695df57d530bdd3f2fed310e5f9ff89e44f7ff24357d WatchSource:0}: Error finding container 4655e74128b67f00374b695df57d530bdd3f2fed310e5f9ff89e44f7ff24357d: Status 404 returned error can't find the container with id 4655e74128b67f00374b695df57d530bdd3f2fed310e5f9ff89e44f7ff24357d
Apr 16 18:09:39.147704 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.147687 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:09:39.152832 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.152795 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" event={"ID":"51e2afdeb323f70af661db7224d1a09f","Type":"ContainerStarted","Data":"1e68bb6b9bef96c3aae8d9fc2206f866e5df7def9b0df691a6f6da7401583bf2"}
Apr 16 18:09:39.154189 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.154170 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"99460a2b083cb8262a03a4f6b644b00a","Type":"ContainerStarted","Data":"4655e74128b67f00374b695df57d530bdd3f2fed310e5f9ff89e44f7ff24357d"} Apr 16 18:09:39.165581 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.165561 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:39.175837 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.175818 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:39.207253 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.207221 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found" Apr 16 18:09:39.307735 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.307653 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found" Apr 16 18:09:39.408122 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.408091 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found" Apr 16 18:09:39.490071 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.490040 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:39.508585 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.508550 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" Apr 16 18:09:39.521783 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.521755 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not 
contain dots]" Apr 16 18:09:39.522864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.522843 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" Apr 16 18:09:39.531116 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.531096 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:09:39.980483 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.980452 2563 apiserver.go:52] "Watching apiserver" Apr 16 18:09:39.993867 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.993836 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:09:39.994988 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.994962 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5lcj4","openshift-image-registry/node-ca-4fj5x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal","openshift-network-diagnostics/network-check-target-4xm4j","openshift-multus/multus-9f96s","openshift-multus/multus-additional-cni-plugins-v9jdb","openshift-multus/network-metrics-daemon-pq587","openshift-network-operator/iptables-alerter-zxsjz","openshift-ovn-kubernetes/ovnkube-node-4qpjc","kube-system/konnectivity-agent-k4n6b","kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64","openshift-cluster-node-tuning-operator/tuned-r69c2"] Apr 16 18:09:39.997781 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.997757 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:39.997924 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.997896 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774" Apr 16 18:09:39.999052 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.999034 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:39.999149 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:39.999125 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:09:39.999256 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:39.999233 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e" Apr 16 18:09:40.000499 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.000475 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.001788 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.001747 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.001899 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.001843 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.001899 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.001852 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:09:40.002062 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.002041 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2b5w5\"" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.002994 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003205 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003228 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x4h8x\"" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003244 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003372 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.003544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003507 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:09:40.003907 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.003648 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.004976 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.004958 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.005546 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.005525 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vwqxk\"" Apr 16 18:09:40.005639 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.005589 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:09:40.005697 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.005671 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:09:40.006731 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.006714 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j5k87\"" Apr 16 18:09:40.006731 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.006729 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 
18:09:40.006868 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.006718 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.006978 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.006958 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.007779 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.007755 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.008171 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.008156 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.008252 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.008180 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:09:40.008252 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.008224 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rqkcd\"" Apr 16 18:09:40.008594 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.008578 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.009603 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.009586 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.009860 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.009844 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:09:40.010225 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.010208 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.010838 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.010815 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fs5s4\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011536 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011551 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011599 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011624 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011664 2563 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-dtdqr\"" Apr 16 18:09:40.011711 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011678 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:09:40.012036 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011721 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:09:40.012036 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.011863 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.013008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.012990 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:09:40.013008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.013005 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.013142 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.013030 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.013313 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.013297 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-462fd\"" Apr 16 18:09:40.014853 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.014828 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:40.014945 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.014911 2563 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g6dg6\"" Apr 16 18:09:40.015008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.014961 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:40.018550 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018529 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/757de476-6da8-4345-b7fc-6c36ed994dea-serviceca\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:40.018638 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018564 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.018638 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018590 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-kubelet\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.018740 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-node-log\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.018740 
ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018697 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.018740 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018724 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-systemd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.018874 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018751 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.018874 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018793 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-system-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.018874 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-bin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.018874 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-hostroot\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.018874 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018858 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-env-overrides\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-multus\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018933 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.018969 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-systemd-units\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019011 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-netns\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019040 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-system-cni-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019069 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019085 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-slash\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019117 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019099 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-netd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019122 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-cnibin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019142 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-conf-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019184 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-kubelet\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019272 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-cni-binary-copy\") pod \"multus-9f96s\" 
(UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019306 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019331 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnr7\" (UniqueName: \"kubernetes.io/projected/afb3aa46-f688-46a6-9d9f-7529d606c9dc-kube-api-access-gmnr7\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019360 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019387 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jdp\" (UniqueName: \"kubernetes.io/projected/757de476-6da8-4345-b7fc-6c36ed994dea-kube-api-access-w4jdp\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019467 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cnibin\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.019512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019514 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-agent-certs\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019542 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffb79d59-c86a-4570-91f6-5257796c1cb9-iptables-alerter-script\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019567 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb79d59-c86a-4570-91f6-5257796c1cb9-host-slash\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019582 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " 
pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019605 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-socket-dir-parent\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019647 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-konnectivity-ca\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019680 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pq8p\" (UniqueName: \"kubernetes.io/projected/5b1617ae-f25b-4a90-adf4-ca28c7c22774-kube-api-access-9pq8p\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019704 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1297cac1-5827-4690-b5d6-38c2ba71da4e-tmp-dir\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019728 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-etc-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019751 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019773 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-ovn\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019795 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-multus-certs\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019819 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qmh\" (UniqueName: \"kubernetes.io/projected/495e5ec9-68fc-4e69-a6b1-a92f31029302-kube-api-access-g6qmh\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.019844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019842 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-daemon-config\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019865 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-os-release\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019886 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1297cac1-5827-4690-b5d6-38c2ba71da4e-hosts-file\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019910 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-log-socket\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.019933 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 
18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020079 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-script-lib\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020207 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5l5\" (UniqueName: \"kubernetes.io/projected/2d1cf278-4df4-49a3-930a-9184e51a38b8-kube-api-access-8j5l5\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020227 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/757de476-6da8-4345-b7fc-6c36ed994dea-host\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020241 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-os-release\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020254 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-k8s-cni-cncf-io\") pod \"multus-9f96s\" (UID: 
\"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.020271 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020274 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-netns\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020297 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-bin\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020321 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020362 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm279\" (UniqueName: \"kubernetes.io/projected/1297cac1-5827-4690-b5d6-38c2ba71da4e-kube-api-access-bm279\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020415 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7kg9\" (UniqueName: 
\"kubernetes.io/projected/ffb79d59-c86a-4570-91f6-5257796c1cb9-kube-api-access-x7kg9\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020444 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-var-lib-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020466 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-config\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.020673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.020482 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-etc-kubernetes\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.052770 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.052732 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:39 +0000 UTC" deadline="2027-10-16 14:28:07.068745952 +0000 UTC" Apr 16 18:09:40.052770 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.052768 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13148h18m27.015981799s" 
Apr 16 18:09:40.109310 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.109277 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:09:40.121507 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-netns\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121522 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-lib-modules\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121553 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-system-cni-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121580 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-netns\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121612 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-kubernetes\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121664 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-system-cni-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.121688 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121669 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-tuned\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121703 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2qtb\" (UniqueName: \"kubernetes.io/projected/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-kube-api-access-b2qtb\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121746 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.121971 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:09:40.121779 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121829 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-slash\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121854 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-netd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121872 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysconfig\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121892 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-run\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121914 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-cnibin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121930 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-sys-fs\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-slash\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121951 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-conf-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.121971 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121971 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-kubelet\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.121991 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-conf\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-cni-binary-copy\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122056 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122077 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnr7\" (UniqueName: \"kubernetes.io/projected/afb3aa46-f688-46a6-9d9f-7529d606c9dc-kube-api-access-gmnr7\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122104 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jdp\" (UniqueName: \"kubernetes.io/projected/757de476-6da8-4345-b7fc-6c36ed994dea-kube-api-access-w4jdp\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122162 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cnibin\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122183 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-agent-certs\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122200 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffb79d59-c86a-4570-91f6-5257796c1cb9-iptables-alerter-script\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122223 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb79d59-c86a-4570-91f6-5257796c1cb9-host-slash\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122250 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-modprobe-d\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122276 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122304 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-socket-dir-parent\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122339 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-konnectivity-ca\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.122522 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:09:40.122357 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122389 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cnibin\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.122522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122446 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pq8p\" (UniqueName: \"kubernetes.io/projected/5b1617ae-f25b-4a90-adf4-ca28c7c22774-kube-api-access-9pq8p\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122458 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-cnibin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122485 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1297cac1-5827-4690-b5d6-38c2ba71da4e-tmp-dir\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.123290 
ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122498 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-conf-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122501 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-netd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122518 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-etc-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122553 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122581 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-ovn\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.123290 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:09:40.122556 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb79d59-c86a-4570-91f6-5257796c1cb9-host-slash\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122625 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-ovn\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122659 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-multus-certs\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122697 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qmh\" (UniqueName: \"kubernetes.io/projected/495e5ec9-68fc-4e69-a6b1-a92f31029302-kube-api-access-g6qmh\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122725 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-sys\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:09:40.122729 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122751 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-var-lib-kubelet\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2"
Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122775 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122803 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-daemon-config\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122828 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-os-release\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.123290 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122855 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-cni-binary-copy\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122878 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1297cac1-5827-4690-b5d6-38c2ba71da4e-hosts-file\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122890 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-konnectivity-ca\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122903 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-log-socket\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122911 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-kubelet\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122933 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122944 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122952 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-etc-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122960 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-script-lib\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-socket-dir-parent\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.122987 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j5l5\" (UniqueName: \"kubernetes.io/projected/2d1cf278-4df4-49a3-930a-9184e51a38b8-kube-api-access-8j5l5\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123014 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-multus-certs\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123021 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-d\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123055 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/757de476-6da8-4345-b7fc-6c36ed994dea-host\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123089 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/757de476-6da8-4345-b7fc-6c36ed994dea-host\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123129 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.123178 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123197 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-log-socket\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124107 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.123282 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:40.623227947 +0000 UTC m=+3.091836714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123616 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1297cac1-5827-4690-b5d6-38c2ba71da4e-tmp-dir\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123650 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-daemon-config\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123685 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-os-release\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123710 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-k8s-cni-cncf-io\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123735 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-netns\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123742 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-os-release\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123782 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1297cac1-5827-4690-b5d6-38c2ba71da4e-hosts-file\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-os-release\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-k8s-cni-cncf-io\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123816 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-bin\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123828 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-run-netns\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123838 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123870 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-cni-bin\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm279\" (UniqueName: \"kubernetes.io/projected/1297cac1-5827-4690-b5d6-38c2ba71da4e-kube-api-access-bm279\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123890 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffb79d59-c86a-4570-91f6-5257796c1cb9-iptables-alerter-script\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123906 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kg9\" (UniqueName: \"kubernetes.io/projected/ffb79d59-c86a-4570-91f6-5257796c1cb9-kube-api-access-x7kg9\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123953 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-var-lib-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.124898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123978 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-config\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.123988 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb3aa46-f688-46a6-9d9f-7529d606c9dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124004 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-systemd\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124029 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-var-lib-openvswitch\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124065 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-etc-kubernetes\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124030 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-etc-kubernetes\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124071 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-script-lib\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124109 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-tmp\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124137 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-device-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124173 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/757de476-6da8-4345-b7fc-6c36ed994dea-serviceca\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124198 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124224 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-kubelet\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124265 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-node-log\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124290 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124295 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-kubelet\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-socket-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124339 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-node-log\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.125610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124347 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-systemd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124374 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124381 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124350 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-multus-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124414 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-run-systemd\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124418 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-system-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124457 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124463 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-bin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124480 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovnkube-config\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124490 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-hostroot\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124499 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-system-cni-dir\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124514 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-env-overrides\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124542 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-registration-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124546 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-bin\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124545 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-hostroot\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124583 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm495\" (UniqueName: \"kubernetes.io/projected/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kube-api-access-cm495\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124596 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/757de476-6da8-4345-b7fc-6c36ed994dea-serviceca\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x"
Apr 16 18:09:40.126201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124624 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-multus\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124658 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/495e5ec9-68fc-4e69-a6b1-a92f31029302-host-var-lib-cni-multus\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124657 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124695 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-systemd-units\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124723 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-host\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124752 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d1cf278-4df4-49a3-930a-9184e51a38b8-systemd-units\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.124964 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1cf278-4df4-49a3-930a-9184e51a38b8-env-overrides\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.125101 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb3aa46-f688-46a6-9d9f-7529d606c9dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.126544 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1cf278-4df4-49a3-930a-9184e51a38b8-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.126864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.126676 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebb55998-f72d-48cf-bc9a-55fbe1047ae8-agent-certs\") pod \"konnectivity-agent-k4n6b\" (UID: \"ebb55998-f72d-48cf-bc9a-55fbe1047ae8\") " pod="kube-system/konnectivity-agent-k4n6b"
Apr 16 18:09:40.134530 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.133908 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pq8p\" (UniqueName: \"kubernetes.io/projected/5b1617ae-f25b-4a90-adf4-ca28c7c22774-kube-api-access-9pq8p\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:40.135576 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.135524 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:40.135576 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.135550 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:40.136141 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.135908 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:40.136141 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.136061 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. No retries permitted until 2026-04-16 18:09:40.636029828 +0000 UTC m=+3.104638599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:40.138513 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.138487 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qmh\" (UniqueName: \"kubernetes.io/projected/495e5ec9-68fc-4e69-a6b1-a92f31029302-kube-api-access-g6qmh\") pod \"multus-9f96s\" (UID: \"495e5ec9-68fc-4e69-a6b1-a92f31029302\") " pod="openshift-multus/multus-9f96s"
Apr 16 18:09:40.139234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.139181 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7kg9\" (UniqueName: \"kubernetes.io/projected/ffb79d59-c86a-4570-91f6-5257796c1cb9-kube-api-access-x7kg9\") pod \"iptables-alerter-zxsjz\" (UID: \"ffb79d59-c86a-4570-91f6-5257796c1cb9\") " pod="openshift-network-operator/iptables-alerter-zxsjz"
Apr 16 18:09:40.139593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.139571 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm279\" (UniqueName: \"kubernetes.io/projected/1297cac1-5827-4690-b5d6-38c2ba71da4e-kube-api-access-bm279\") pod \"node-resolver-5lcj4\" (UID: \"1297cac1-5827-4690-b5d6-38c2ba71da4e\") " pod="openshift-dns/node-resolver-5lcj4"
Apr 16 18:09:40.140127 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.140103 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4jdp\" (UniqueName: \"kubernetes.io/projected/757de476-6da8-4345-b7fc-6c36ed994dea-kube-api-access-w4jdp\") pod \"node-ca-4fj5x\" (UID: \"757de476-6da8-4345-b7fc-6c36ed994dea\") " pod="openshift-image-registry/node-ca-4fj5x"
Apr 16 18:09:40.140234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.140180 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnr7\" (UniqueName: \"kubernetes.io/projected/afb3aa46-f688-46a6-9d9f-7529d606c9dc-kube-api-access-gmnr7\") pod \"multus-additional-cni-plugins-v9jdb\" (UID: \"afb3aa46-f688-46a6-9d9f-7529d606c9dc\") " pod="openshift-multus/multus-additional-cni-plugins-v9jdb"
Apr 16 18:09:40.140327 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.140302 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5l5\" (UniqueName: \"kubernetes.io/projected/2d1cf278-4df4-49a3-930a-9184e51a38b8-kube-api-access-8j5l5\") pod \"ovnkube-node-4qpjc\" (UID: \"2d1cf278-4df4-49a3-930a-9184e51a38b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc"
Apr 16 18:09:40.225087 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225041 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm495\" (UniqueName: \"kubernetes.io/projected/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kube-api-access-cm495\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64"
Apr 16 18:09:40.225087 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225086 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-host\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225158 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-host\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225220 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-lib-modules\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225262 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-kubernetes\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225285 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-tuned\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225512 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:09:40.225350 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-kubernetes\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225383 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2qtb\" (UniqueName: \"kubernetes.io/projected/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-kube-api-access-b2qtb\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225429 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.225512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225451 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-lib-modules\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225507 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225515 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysconfig\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225458 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysconfig\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225547 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-run\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225571 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-sys-fs\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225600 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-conf\") pod \"tuned-r69c2\" (UID: 
\"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225647 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-modprobe-d\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.225699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225684 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-run\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225711 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-sys\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225738 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-var-lib-kubelet\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225744 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-sys-fs\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: 
\"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225761 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225790 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-d\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225795 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-sys\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225828 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-systemd\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225826 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-conf\") pod \"tuned-r69c2\" (UID: 
\"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225874 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-tmp\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225889 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225906 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-device-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225942 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-systemd\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225951 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-sysctl-d\") pod \"tuned-r69c2\" (UID: 
\"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225961 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-var-lib-kubelet\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.225983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-socket-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.226014 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-registration-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.226020 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-modprobe-d\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.226871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.226014 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-device-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.226111 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-socket-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.226871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.226115 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-registration-dir\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.227790 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.227758 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:40.227948 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.227926 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-etc-tuned\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.228201 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.228180 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-tmp\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.235047 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.234980 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2qtb\" (UniqueName: \"kubernetes.io/projected/ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb-kube-api-access-b2qtb\") pod \"tuned-r69c2\" (UID: \"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb\") " pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.235175 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.235042 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm495\" (UniqueName: \"kubernetes.io/projected/ff6d0cad-d820-4150-8b32-c6ba5e7da71f-kube-api-access-cm495\") pod \"aws-ebs-csi-driver-node-fsz64\" (UID: \"ff6d0cad-d820-4150-8b32-c6ba5e7da71f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.310889 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.310849 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4fj5x" Apr 16 18:09:40.318699 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.318672 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9f96s" Apr 16 18:09:40.326468 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.326436 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" Apr 16 18:09:40.332131 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.332105 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5lcj4" Apr 16 18:09:40.343815 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.343788 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zxsjz" Apr 16 18:09:40.350603 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.350581 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:09:40.356247 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.356228 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:09:40.362839 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.362818 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" Apr 16 18:09:40.367572 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.367549 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r69c2" Apr 16 18:09:40.628462 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.628365 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:09:40.628623 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.628536 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:40.628623 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.628601 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:41.628581209 +0000 UTC m=+4.097189978 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:40.728682 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:40.728655 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:09:40.728864 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.728781 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:40.728864 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.728796 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:40.728864 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.728804 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:40.728864 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:40.728854 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. 
No retries permitted until 2026-04-16 18:09:41.728838033 +0000 UTC m=+4.197446807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:40.780375 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.780341 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495e5ec9_68fc_4e69_a6b1_a92f31029302.slice/crio-dd20e3c078faa64295db65ef33f458dc387525d615dd01c91a40aec41782a629 WatchSource:0}: Error finding container dd20e3c078faa64295db65ef33f458dc387525d615dd01c91a40aec41782a629: Status 404 returned error can't find the container with id dd20e3c078faa64295db65ef33f458dc387525d615dd01c91a40aec41782a629 Apr 16 18:09:40.787540 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.787509 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1cf278_4df4_49a3_930a_9184e51a38b8.slice/crio-46b68f718b8994acb3dbecd49cc6208577015a53c30fbd7fb0778cfd854925ce WatchSource:0}: Error finding container 46b68f718b8994acb3dbecd49cc6208577015a53c30fbd7fb0778cfd854925ce: Status 404 returned error can't find the container with id 46b68f718b8994acb3dbecd49cc6208577015a53c30fbd7fb0778cfd854925ce Apr 16 18:09:40.788435 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.788414 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb55998_f72d_48cf_bc9a_55fbe1047ae8.slice/crio-5202e1826401ec10732d9c8d8ac4d6b177f748a672ab2dd687bd1781466b138f WatchSource:0}: Error finding container 
5202e1826401ec10732d9c8d8ac4d6b177f748a672ab2dd687bd1781466b138f: Status 404 returned error can't find the container with id 5202e1826401ec10732d9c8d8ac4d6b177f748a672ab2dd687bd1781466b138f Apr 16 18:09:40.789670 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.789618 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce669c4a_26f5_4db6_8ebc_9f4e5f3770bb.slice/crio-a13d9474dbf902f4c75cf4a7e2577e387af55f39e385a491aa9873950e6d37c8 WatchSource:0}: Error finding container a13d9474dbf902f4c75cf4a7e2577e387af55f39e385a491aa9873950e6d37c8: Status 404 returned error can't find the container with id a13d9474dbf902f4c75cf4a7e2577e387af55f39e385a491aa9873950e6d37c8 Apr 16 18:09:40.790455 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.790368 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6d0cad_d820_4150_8b32_c6ba5e7da71f.slice/crio-db11cd6add100aefd2ec2f2d309a6c3d2bbea37e8c2c93015188ed744f3742eb WatchSource:0}: Error finding container db11cd6add100aefd2ec2f2d309a6c3d2bbea37e8c2c93015188ed744f3742eb: Status 404 returned error can't find the container with id db11cd6add100aefd2ec2f2d309a6c3d2bbea37e8c2c93015188ed744f3742eb Apr 16 18:09:40.792496 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.791596 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb3aa46_f688_46a6_9d9f_7529d606c9dc.slice/crio-0181119a0432f8557ec6ca2b3bc3e08365e1319720e4b13b36c664301eebe78a WatchSource:0}: Error finding container 0181119a0432f8557ec6ca2b3bc3e08365e1319720e4b13b36c664301eebe78a: Status 404 returned error can't find the container with id 0181119a0432f8557ec6ca2b3bc3e08365e1319720e4b13b36c664301eebe78a Apr 16 18:09:40.792496 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.791964 2563 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffb79d59_c86a_4570_91f6_5257796c1cb9.slice/crio-adb1a649a8ce4020ea545444cd8caf9e7e14a46736461b1aa585d363dc1dbd1d WatchSource:0}: Error finding container adb1a649a8ce4020ea545444cd8caf9e7e14a46736461b1aa585d363dc1dbd1d: Status 404 returned error can't find the container with id adb1a649a8ce4020ea545444cd8caf9e7e14a46736461b1aa585d363dc1dbd1d Apr 16 18:09:40.793611 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.793588 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757de476_6da8_4345_b7fc_6c36ed994dea.slice/crio-2449d03da3682bd63e0ebbe10c84d1309b892e2ef3a149b3656149f8507a8ab3 WatchSource:0}: Error finding container 2449d03da3682bd63e0ebbe10c84d1309b892e2ef3a149b3656149f8507a8ab3: Status 404 returned error can't find the container with id 2449d03da3682bd63e0ebbe10c84d1309b892e2ef3a149b3656149f8507a8ab3 Apr 16 18:09:40.794662 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:09:40.794640 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1297cac1_5827_4690_b5d6_38c2ba71da4e.slice/crio-6b0b008184484ba4dbd3dfccec9531d432f7f222f5d1807ec7a8af34a7d30ac8 WatchSource:0}: Error finding container 6b0b008184484ba4dbd3dfccec9531d432f7f222f5d1807ec7a8af34a7d30ac8: Status 404 returned error can't find the container with id 6b0b008184484ba4dbd3dfccec9531d432f7f222f5d1807ec7a8af34a7d30ac8 Apr 16 18:09:41.054243 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.053924 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:39 +0000 UTC" deadline="2027-12-09 17:55:52.659383981 +0000 UTC" Apr 16 18:09:41.054243 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.054162 2563 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14447h46m11.60522703s"
Apr 16 18:09:41.151286 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.150799 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:41.151286 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.150926 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:41.165923 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.165767 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r69c2" event={"ID":"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb","Type":"ContainerStarted","Data":"a13d9474dbf902f4c75cf4a7e2577e387af55f39e385a491aa9873950e6d37c8"}
Apr 16 18:09:41.170449 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.170333 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" event={"ID":"51e2afdeb323f70af661db7224d1a09f","Type":"ContainerStarted","Data":"8978958708f0d7799b6d0e946b89a657192b0d4c199e7a734844c6c55491eca3"}
Apr 16 18:09:41.174412 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.174223 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4fj5x" event={"ID":"757de476-6da8-4345-b7fc-6c36ed994dea","Type":"ContainerStarted","Data":"2449d03da3682bd63e0ebbe10c84d1309b892e2ef3a149b3656149f8507a8ab3"}
Apr 16 18:09:41.183521 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.182711 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zxsjz" event={"ID":"ffb79d59-c86a-4570-91f6-5257796c1cb9","Type":"ContainerStarted","Data":"adb1a649a8ce4020ea545444cd8caf9e7e14a46736461b1aa585d363dc1dbd1d"}
Apr 16 18:09:41.186806 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.186623 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" podStartSLOduration=2.186605032 podStartE2EDuration="2.186605032s" podCreationTimestamp="2026-04-16 18:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:41.186270869 +0000 UTC m=+3.654879647" watchObservedRunningTime="2026-04-16 18:09:41.186605032 +0000 UTC m=+3.655213810"
Apr 16 18:09:41.187882 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.187831 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" event={"ID":"ff6d0cad-d820-4150-8b32-c6ba5e7da71f","Type":"ContainerStarted","Data":"db11cd6add100aefd2ec2f2d309a6c3d2bbea37e8c2c93015188ed744f3742eb"}
Apr 16 18:09:41.190256 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.190194 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k4n6b" event={"ID":"ebb55998-f72d-48cf-bc9a-55fbe1047ae8","Type":"ContainerStarted","Data":"5202e1826401ec10732d9c8d8ac4d6b177f748a672ab2dd687bd1781466b138f"}
Apr 16 18:09:41.192335 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.192309 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"46b68f718b8994acb3dbecd49cc6208577015a53c30fbd7fb0778cfd854925ce"}
Apr 16 18:09:41.199432 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.199350 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9f96s" event={"ID":"495e5ec9-68fc-4e69-a6b1-a92f31029302","Type":"ContainerStarted","Data":"dd20e3c078faa64295db65ef33f458dc387525d615dd01c91a40aec41782a629"}
Apr 16 18:09:41.203517 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.203478 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lcj4" event={"ID":"1297cac1-5827-4690-b5d6-38c2ba71da4e","Type":"ContainerStarted","Data":"6b0b008184484ba4dbd3dfccec9531d432f7f222f5d1807ec7a8af34a7d30ac8"}
Apr 16 18:09:41.205571 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.205547 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerStarted","Data":"0181119a0432f8557ec6ca2b3bc3e08365e1319720e4b13b36c664301eebe78a"}
Apr 16 18:09:41.636014 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.635980 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:41.636205 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.636151 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:41.636504 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.636372 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.636194868 +0000 UTC m=+6.104803657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:41.739885 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:41.737261 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:41.739885 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.737468 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:41.739885 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.737489 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:41.739885 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.737501 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:41.739885 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:41.737562 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.737543659 +0000 UTC m=+6.206152419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:42.152006 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:42.151972 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:42.152484 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:42.152125 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:42.220576 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:42.220010 2563 generic.go:358] "Generic (PLEG): container finished" podID="99460a2b083cb8262a03a4f6b644b00a" containerID="16471ff26711bb65624dfc7d7c5795a92577d9ef789ccb132f86b64d0df000ef" exitCode=0
Apr 16 18:09:42.220576 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:42.220246 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"99460a2b083cb8262a03a4f6b644b00a","Type":"ContainerDied","Data":"16471ff26711bb65624dfc7d7c5795a92577d9ef789ccb132f86b64d0df000ef"}
Apr 16 18:09:43.151510 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:43.150925 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:43.151510 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.151046 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:43.247421 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:43.247371 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"99460a2b083cb8262a03a4f6b644b00a","Type":"ContainerStarted","Data":"f0fb7a33f5a3d4fd40da07428577e559409127421ad1897f03d97a4e604f19d1"}
Apr 16 18:09:43.261253 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:43.261182 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" podStartSLOduration=4.261161404 podStartE2EDuration="4.261161404s" podCreationTimestamp="2026-04-16 18:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:43.260664504 +0000 UTC m=+5.729273284" watchObservedRunningTime="2026-04-16 18:09:43.261161404 +0000 UTC m=+5.729770183"
Apr 16 18:09:43.653890 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:43.653847 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:43.654093 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.654024 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.654157 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.654097 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:47.654078462 +0000 UTC m=+10.122687223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.754867 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:43.754833 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:43.755049 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.755002 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:43.755049 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.755016 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:43.755049 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.755028 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:43.755219 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:43.755084 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. No retries permitted until 2026-04-16 18:09:47.755068651 +0000 UTC m=+10.223677406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:44.151572 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:44.151536 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:44.151768 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:44.151680 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:45.150205 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:45.150151 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:45.150700 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:45.150367 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:46.150836 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:46.150802 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:46.151271 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:46.150947 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:47.150696 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:47.150659 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:47.150879 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.150792 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:47.692128 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:47.691837 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:47.692128 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.692036 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:47.692128 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.692102 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:55.692081586 +0000 UTC m=+18.160690360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:47.792435 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:47.792372 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:47.792611 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.792591 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:47.792680 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.792613 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:47.792680 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.792626 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:47.792789 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:47.792692 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. No retries permitted until 2026-04-16 18:09:55.792673435 +0000 UTC m=+18.261282195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:48.151208 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:48.150728 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:48.151208 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:48.150866 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:49.150120 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:49.150081 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:49.150297 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:49.150216 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:50.150668 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:50.150568 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:50.151123 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:50.150734 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:51.149985 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:51.149944 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:51.150140 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:51.150075 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:52.150382 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:52.150350 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:52.150803 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:52.150512 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:53.150610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:53.150576 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:53.151020 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:53.150701 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:54.150593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:54.150561 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:54.150778 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:54.150678 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:55.150153 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:55.150115 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:55.150379 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.150242 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:55.746131 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:55.746016 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:55.746585 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.746164 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:55.746585 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.746240 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.746218653 +0000 UTC m=+34.214827410 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:55.847393 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:55.847355 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:55.847574 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.847552 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:55.847624 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.847577 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:55.847624 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.847591 2563 projected.go:194] Error preparing data for projected volume kube-api-access-mqck6 for pod openshift-network-diagnostics/network-check-target-4xm4j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:55.847706 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:55.847655 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6 podName:76ba3cac-7c44-4ba0-aefc-cfded09ee26e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.847636104 +0000 UTC m=+34.316244862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mqck6" (UniqueName: "kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6") pod "network-check-target-4xm4j" (UID: "76ba3cac-7c44-4ba0-aefc-cfded09ee26e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:56.150062 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:56.149958 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:56.150218 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:56.150123 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:57.150459 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:57.150422 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:57.150904 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:57.150533 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:09:58.151212 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.150797 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:09:58.151874 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:58.151311 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:09:58.273854 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.273824 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lcj4" event={"ID":"1297cac1-5827-4690-b5d6-38c2ba71da4e","Type":"ContainerStarted","Data":"f22759cd5b634ead90926d5a6db1818c6e5c1b1c7badda1f462d7350040e3eaf"}
Apr 16 18:09:58.275250 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.275224 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="9263adac345cd9ba4e52ec5d9effeefca96ef2abdb86cf62b8409f68e305cf8a" exitCode=0
Apr 16 18:09:58.275374 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.275297 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"9263adac345cd9ba4e52ec5d9effeefca96ef2abdb86cf62b8409f68e305cf8a"}
Apr 16 18:09:58.276758 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.276738 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r69c2" event={"ID":"ce669c4a-26f5-4db6-8ebc-9f4e5f3770bb","Type":"ContainerStarted","Data":"462f7d970a007db57c27a96a62c68e5600422d876aa207e97ce62bb9117982b1"}
Apr 16 18:09:58.278314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.278198 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4fj5x" event={"ID":"757de476-6da8-4345-b7fc-6c36ed994dea","Type":"ContainerStarted","Data":"b7e6511bfd715175c4805595e0f4e6d84ccd503bde423ab0d1b38a916ecdb041"}
Apr 16 18:09:58.279566 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.279547 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" event={"ID":"ff6d0cad-d820-4150-8b32-c6ba5e7da71f","Type":"ContainerStarted","Data":"0391d03aa281a0b47638f5bd5473d32590ddb4e3d9fd489490c133542e4b354f"}
Apr 16 18:09:58.280939 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.280761 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k4n6b" event={"ID":"ebb55998-f72d-48cf-bc9a-55fbe1047ae8","Type":"ContainerStarted","Data":"d16d9c8c3711d8d7fa2b6d6836e2369f8717bdd9d99a06d04c7c0a42a2c256f5"}
Apr 16 18:09:58.283354 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.283223 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"0e76ba3a22672b61ec7f3ffffa82bbe3f1fe3b6e8444188f75da1e1bba7dac59"}
Apr 16 18:09:58.283354 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.283253 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"6694c531bf9b479d0f7f096ebef010388a23ff0e04c8012c151a17d0355e0edd"}
Apr 16 18:09:58.283354 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.283266 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"756462095fc6090799fe3e4cde5e490fcc1d5f1d9fedcfe9440b5bd0da04a949"}
Apr 16 18:09:58.283354 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.283278 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"21f1f9b12ab6635e4d187716eebe89233afb857f7a765ff1568a2dd74404df01"}
Apr 16 18:09:58.284655 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.284633 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9f96s" event={"ID":"495e5ec9-68fc-4e69-a6b1-a92f31029302","Type":"ContainerStarted","Data":"75d5e6da84b1f3e11e757265b35941c2371568bb80a0fadd0b55b80bd5414623"}
Apr 16 18:09:58.286460 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.286425 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5lcj4" podStartSLOduration=3.324516876 podStartE2EDuration="20.286414254s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.797391267 +0000 UTC m=+3.266000023" lastFinishedPulling="2026-04-16 18:09:57.759288634 +0000 UTC m=+20.227897401" observedRunningTime="2026-04-16 18:09:58.286365421 +0000 UTC m=+20.754974198" watchObservedRunningTime="2026-04-16 18:09:58.286414254 +0000 UTC m=+20.755023030"
Apr 16 18:09:58.325781 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.325740 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-k4n6b" podStartSLOduration=3.431515317 podStartE2EDuration="20.325726477s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.79067326 +0000 UTC m=+3.259282019" lastFinishedPulling="2026-04-16 18:09:57.684884409 +0000 UTC m=+20.153493179" observedRunningTime="2026-04-16 18:09:58.313890522 +0000 UTC m=+20.782499300" watchObservedRunningTime="2026-04-16 18:09:58.325726477 +0000 UTC m=+20.794335236"
Apr 16 18:09:58.326057 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.326023 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4fj5x" podStartSLOduration=3.43678413 podStartE2EDuration="20.326016498s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.795654589 +0000 UTC m=+3.264263351" lastFinishedPulling="2026-04-16 18:09:57.684886964 +0000 UTC m=+20.153495719" observedRunningTime="2026-04-16 18:09:58.325849139 +0000 UTC m=+20.794457917" watchObservedRunningTime="2026-04-16 18:09:58.326016498 +0000 UTC m=+20.794625327"
Apr 16 18:09:58.341316 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.341271 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9f96s" podStartSLOduration=3.364269133 podStartE2EDuration="20.341257578s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.783709218 +0000 UTC m=+3.252317988" lastFinishedPulling="2026-04-16 18:09:57.760697678 +0000 UTC m=+20.229306433" observedRunningTime="2026-04-16 18:09:58.340675299 +0000 UTC m=+20.809284076" watchObservedRunningTime="2026-04-16 18:09:58.341257578 +0000 UTC m=+20.809866354"
Apr 16 18:09:58.356233 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:58.356181 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r69c2" podStartSLOduration=3.42523065 podStartE2EDuration="20.356160552s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.791587611 +0000 UTC m=+3.260196385" lastFinishedPulling="2026-04-16 18:09:57.722517532 +0000 UTC m=+20.191126287" observedRunningTime="2026-04-16 18:09:58.355934491 +0000 UTC m=+20.824543268" watchObservedRunningTime="2026-04-16 18:09:58.356160552 +0000 UTC m=+20.824769329"
Apr 16 18:09:59.011373 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.011328 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:09:59.083966 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.083852 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:09:59.011348012Z","UUID":"850a551f-a843-4e14-9ab8-441fb56b6a3a","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:09:59.086477 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.086449 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:09:59.086477 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.086485 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:09:59.150053 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.149960 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:09:59.150213 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:09:59.150098 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e" Apr 16 18:09:59.288324 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.288200 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zxsjz" event={"ID":"ffb79d59-c86a-4570-91f6-5257796c1cb9","Type":"ContainerStarted","Data":"34440fc1160d2660435777fb65a20e3ef702347cea004d1b8dfbd79fd0a521ce"} Apr 16 18:09:59.289957 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.289926 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" event={"ID":"ff6d0cad-d820-4150-8b32-c6ba5e7da71f","Type":"ContainerStarted","Data":"dd64ee8d5b2eb2e99f18d34e3e38dc32077330d5538fdbd1d1675f14926c22b9"} Apr 16 18:09:59.292897 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.292872 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"697ba43a04bf4943c5cf0133d4bc328b2cb9ce97e4087012e2953ffc4ed6d859"} Apr 16 18:09:59.292991 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.292905 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"1c3b016435251fc76b502bb8a69caea2ff09d5084ce3af2a1eb511f4e92c9f36"} Apr 16 18:09:59.301727 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:09:59.301688 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zxsjz" podStartSLOduration=4.374429737 podStartE2EDuration="21.301676453s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.79525525 +0000 UTC m=+3.263864014" lastFinishedPulling="2026-04-16 18:09:57.72250196 +0000 UTC m=+20.191110730" 
observedRunningTime="2026-04-16 18:09:59.301271077 +0000 UTC m=+21.769879854" watchObservedRunningTime="2026-04-16 18:09:59.301676453 +0000 UTC m=+21.770285230" Apr 16 18:10:00.150992 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.150727 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:00.150992 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.150878 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774" Apr 16 18:10:00.172392 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.172361 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5987t"] Apr 16 18:10:00.178234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.178207 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.178412 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.178277 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217" Apr 16 18:10:00.283150 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.283104 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-kubelet-config\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.283340 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.283160 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-dbus\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.283340 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.283251 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.296813 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.296763 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" event={"ID":"ff6d0cad-d820-4150-8b32-c6ba5e7da71f","Type":"ContainerStarted","Data":"acfd582ce0f635d3f93dbc7168a69fdffb691ac72076c79a1501596b98cab0d9"} Apr 16 18:10:00.315827 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.315763 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fsz64" podStartSLOduration=3.328767879 
podStartE2EDuration="22.315741739s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.792602617 +0000 UTC m=+3.261211377" lastFinishedPulling="2026-04-16 18:09:59.779576477 +0000 UTC m=+22.248185237" observedRunningTime="2026-04-16 18:10:00.315172401 +0000 UTC m=+22.783781188" watchObservedRunningTime="2026-04-16 18:10:00.315741739 +0000 UTC m=+22.784350517" Apr 16 18:10:00.384707 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.384663 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-kubelet-config\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.384893 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.384730 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-dbus\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.384893 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.384774 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-kubelet-config\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.384893 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.384826 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " 
pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.385062 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.384961 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-dbus\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.385062 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.384992 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:00.385170 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.385113 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret podName:50d6b6c6-7edb-4c20-9fa7-9d7f97465217 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:00.885093093 +0000 UTC m=+23.353701849 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret") pod "global-pull-secret-syncer-5987t" (UID: "50d6b6c6-7edb-4c20-9fa7-9d7f97465217") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:00.888759 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:00.888720 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:00.888928 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.888860 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:00.888928 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:00.888926 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret podName:50d6b6c6-7edb-4c20-9fa7-9d7f97465217 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:01.888911456 +0000 UTC m=+24.357520211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret") pod "global-pull-secret-syncer-5987t" (UID: "50d6b6c6-7edb-4c20-9fa7-9d7f97465217") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:01.150588 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:01.150494 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:01.150739 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:01.150628 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e" Apr 16 18:10:01.301716 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:01.301679 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"882bae5d2ccc4e79d521fb4d02504fdac7397e617cc55a470a2c758da3cbf7c2"} Apr 16 18:10:01.894238 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:01.894201 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:01.894433 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:01.894382 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:01.894487 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:01.894476 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret podName:50d6b6c6-7edb-4c20-9fa7-9d7f97465217 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:03.894456248 +0000 UTC m=+26.363065019 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret") pod "global-pull-secret-syncer-5987t" (UID: "50d6b6c6-7edb-4c20-9fa7-9d7f97465217") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:02.150317 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.150069 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:02.150496 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.150147 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:02.150496 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:02.150371 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217" Apr 16 18:10:02.150496 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:02.150474 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774" Apr 16 18:10:02.159055 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.159032 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:10:02.159617 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.159598 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:10:02.303116 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.303092 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:10:02.303898 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:02.303877 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-k4n6b" Apr 16 18:10:03.150871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.150785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:03.151121 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:03.150909 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e" Apr 16 18:10:03.307159 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.307123 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" event={"ID":"2d1cf278-4df4-49a3-930a-9184e51a38b8","Type":"ContainerStarted","Data":"044c38bb58fa7e4e07cb166e9d700fc52e3f357d1dcc01dc05638591891a7ba2"} Apr 16 18:10:03.308030 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.307577 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:03.308030 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.307605 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:03.308030 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.307617 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:03.309042 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.309014 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="06260647d97e6e4e7b3bc64409b742a900a71db9bb5e81d13a1ca547976234d7" exitCode=0 Apr 16 18:10:03.309275 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.309101 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"06260647d97e6e4e7b3bc64409b742a900a71db9bb5e81d13a1ca547976234d7"} Apr 16 18:10:03.322299 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.322273 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:03.322457 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.322362 2563 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:03.337151 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.337106 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" podStartSLOduration=8.279129212 podStartE2EDuration="25.337093532s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.78982635 +0000 UTC m=+3.258435106" lastFinishedPulling="2026-04-16 18:09:57.847790656 +0000 UTC m=+20.316399426" observedRunningTime="2026-04-16 18:10:03.336871074 +0000 UTC m=+25.805479873" watchObservedRunningTime="2026-04-16 18:10:03.337093532 +0000 UTC m=+25.805702308" Apr 16 18:10:03.911163 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:03.911126 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:03.911365 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:03.911261 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:03.911365 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:03.911323 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret podName:50d6b6c6-7edb-4c20-9fa7-9d7f97465217 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:07.9113072 +0000 UTC m=+30.379915956 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret") pod "global-pull-secret-syncer-5987t" (UID: "50d6b6c6-7edb-4c20-9fa7-9d7f97465217") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:04.150566 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.150299 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:04.150758 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.150299 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:04.150758 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:04.150697 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774" Apr 16 18:10:04.150872 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:04.150773 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217" Apr 16 18:10:04.313046 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.313016 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="d0fed7fa7b68304a22f5ea29f551ca1d9bd62e4c216eeac692793e6bc1f1d113" exitCode=0 Apr 16 18:10:04.313486 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.313100 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"d0fed7fa7b68304a22f5ea29f551ca1d9bd62e4c216eeac692793e6bc1f1d113"} Apr 16 18:10:04.486523 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.486488 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5987t"] Apr 16 18:10:04.486672 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.486622 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:04.486732 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:04.486711 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217" Apr 16 18:10:04.489589 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.489560 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pq587"] Apr 16 18:10:04.489887 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.489865 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:04.490010 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:04.489992 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774" Apr 16 18:10:04.490413 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.490372 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4xm4j"] Apr 16 18:10:04.490534 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:04.490518 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:04.490675 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:04.490645 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e" Apr 16 18:10:05.317570 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:05.317538 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="c955a99a010b7cde520cd7929ea94999be71c198c063ad6607359d40a73351cc" exitCode=0 Apr 16 18:10:05.317945 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:05.317627 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"c955a99a010b7cde520cd7929ea94999be71c198c063ad6607359d40a73351cc"} Apr 16 18:10:06.150142 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:06.150114 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:06.150142 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:06.150144 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:06.150387 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:06.150119 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:06.150387 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:06.150231 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:10:06.150387 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:06.150307 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217"
Apr 16 18:10:06.150566 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:06.150426 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:10:07.944115 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:07.944063 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t"
Apr 16 18:10:07.944661 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:07.944217 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:07.944661 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:07.944292 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret podName:50d6b6c6-7edb-4c20-9fa7-9d7f97465217 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.944275333 +0000 UTC m=+38.412884088 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret") pod "global-pull-secret-syncer-5987t" (UID: "50d6b6c6-7edb-4c20-9fa7-9d7f97465217") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:08.151598 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:08.151557 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:10:08.151762 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:08.151611 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t"
Apr 16 18:10:08.151762 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:08.151666 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:10:08.151762 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:08.151679 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4xm4j" podUID="76ba3cac-7c44-4ba0-aefc-cfded09ee26e"
Apr 16 18:10:08.151880 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:08.151777 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pq587" podUID="5b1617ae-f25b-4a90-adf4-ca28c7c22774"
Apr 16 18:10:08.151880 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:08.151864 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5987t" podUID="50d6b6c6-7edb-4c20-9fa7-9d7f97465217"
Apr 16 18:10:09.809882 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.809855 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeReady"
Apr 16 18:10:09.810329 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.809977 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:10:09.841695 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.841661 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"]
Apr 16 18:10:09.874157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.874125 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"]
Apr 16 18:10:09.874332 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.874313 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"
Apr 16 18:10:09.878075 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.878042 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mt8q9\""
Apr 16 18:10:09.878484 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.878461 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.878728 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.878690 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.897446 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.896480 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"]
Apr 16 18:10:09.897446 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.896883 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:09.900163 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.900009 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:10:09.900163 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.900087 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:10:09.900375 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.900297 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:10:09.900748 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.900576 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v2fsg\""
Apr 16 18:10:09.914812 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.914765 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:10:09.916124 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.916096 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-z7qz9"]
Apr 16 18:10:09.916267 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.916217 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:09.918816 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.918791 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:10:09.919040 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.919020 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.919139 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.919074 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zpwl4\""
Apr 16 18:10:09.919299 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.919278 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.934459 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.934433 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"]
Apr 16 18:10:09.934593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.934467 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-z7qz9"]
Apr 16 18:10:09.934593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.934480 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"]
Apr 16 18:10:09.934593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.934492 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-c6f985f48-6wxzx"]
Apr 16 18:10:09.934593 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.934518 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:09.937544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.937325 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 18:10:09.937544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.937380 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 18:10:09.937544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.937381 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.937544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.937450 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9rc2l\""
Apr 16 18:10:09.937544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.937381 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.943535 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.943510 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 18:10:09.956143 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.956107 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"]
Apr 16 18:10:09.956293 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.956121 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:09.958723 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.958699 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqt4d\" (UniqueName: \"kubernetes.io/projected/ea87fd15-995a-4cb6-9f35-8ef427ef8e52-kube-api-access-wqt4d\") pod \"volume-data-source-validator-7d955d5dd4-z46bv\" (UID: \"ea87fd15-995a-4cb6-9f35-8ef427ef8e52\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"
Apr 16 18:10:09.962713 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.962688 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:10:09.968068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968032 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:10:09.968068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968059 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.968347 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968321 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.968347 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968346 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:10:09.968577 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968385 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:10:09.968577 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968419 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2v4gp\""
Apr 16 18:10:09.968737 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968723 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"]
Apr 16 18:10:09.968878 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.968865 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:09.981643 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.981621 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:10:09.981775 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.981715 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:10:09.981842 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.981783 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.982067 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.982047 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-j64n7\""
Apr 16 18:10:09.982245 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.982162 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.994323 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.994282 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"]
Apr 16 18:10:09.994489 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.994444 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:09.997563 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.997538 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:10:09.997717 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.997612 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.997717 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.997645 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.997863 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.997797 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:10:09.997945 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:09.997931 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-g52k7\""
Apr 16 18:10:10.012632 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.012605 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"]
Apr 16 18:10:10.012782 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.012640 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-rxktg"]
Apr 16 18:10:10.012782 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.012691 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.015518 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.015496 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:10:10.017268 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.016981 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:10:10.017268 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.017003 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:10:10.017268 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.017031 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g72qw\""
Apr 16 18:10:10.017268 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.017044 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:10:10.031202 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.031167 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"]
Apr 16 18:10:10.031333 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.031231 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.034125 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.034100 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:10:10.034262 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.034245 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:10:10.034357 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.034343 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-s2hcx\""
Apr 16 18:10:10.034891 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.034678 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:10:10.034891 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.034724 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:10:10.042815 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.042790 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"]
Apr 16 18:10:10.042944 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.042928 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:10.043389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.043366 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:10:10.045859 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.045816 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 18:10:10.045970 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.045892 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xkp6q\""
Apr 16 18:10:10.045970 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.045951 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 18:10:10.055582 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.055560 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"]
Apr 16 18:10:10.055726 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.055696 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"
Apr 16 18:10:10.058533 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.058510 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:10:10.058659 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.058567 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-xbfmv\""
Apr 16 18:10:10.058712 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.058510 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:10:10.059003 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.058977 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059015 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-stats-auth\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.059104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059048 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059067 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e658950a-b3d3-49ff-a5ce-4445a68ef06f-serving-cert\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.059104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059081 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-trusted-ca\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.059104 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059100 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfprr\" (UniqueName: \"kubernetes.io/projected/94f90437-3d9f-443f-9a60-c3237a421595-kube-api-access-cfprr\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059117 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0b4ef5-4d33-472c-bc36-98eecd77e026-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059143 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqt4d\" (UniqueName: \"kubernetes.io/projected/ea87fd15-995a-4cb6-9f35-8ef427ef8e52-kube-api-access-wqt4d\") pod \"volume-data-source-validator-7d955d5dd4-z46bv\" (UID: \"ea87fd15-995a-4cb6-9f35-8ef427ef8e52\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059159 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059175 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059192 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtsh\" (UniqueName: \"kubernetes.io/projected/138781da-d4c5-4c2a-b64e-b740488095cf-kube-api-access-fwtsh\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059217 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059241 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72jd\" (UniqueName: \"kubernetes.io/projected/3c225701-67ef-433b-93da-7745170f4769-kube-api-access-d72jd\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059271 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f90437-3d9f-443f-9a60-c3237a421595-config\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.059288 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059286 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059300 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-config\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059317 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059341 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f90437-3d9f-443f-9a60-c3237a421595-serving-cert\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059359 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059374 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0b4ef5-4d33-472c-bc36-98eecd77e026-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059489 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059539 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059587 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8h7\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059613 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-default-certificate\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059664 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkp6\" (UniqueName: \"kubernetes.io/projected/7d0b4ef5-4d33-472c-bc36-98eecd77e026-kube-api-access-pxkp6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.059735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.059729 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2rv\" (UniqueName: \"kubernetes.io/projected/e658950a-b3d3-49ff-a5ce-4445a68ef06f-kube-api-access-qv2rv\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.067587 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.067566 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"]
Apr 16 18:10:10.067854 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.067832 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.070726 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.070705 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 18:10:10.070833 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.070749 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 18:10:10.070922 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.070705 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 18:10:10.071014 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.070927 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 18:10:10.074409 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.074377 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqt4d\" (UniqueName: \"kubernetes.io/projected/ea87fd15-995a-4cb6-9f35-8ef427ef8e52-kube-api-access-wqt4d\") pod \"volume-data-source-validator-7d955d5dd4-z46bv\" (UID: \"ea87fd15-995a-4cb6-9f35-8ef427ef8e52\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"
Apr 16 18:10:10.081606 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.081388 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"]
Apr 16 18:10:10.081734 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.081511 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.085564 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.085547 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-ckq7n\""
Apr 16 18:10:10.085690 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.085673 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 18:10:10.097811 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.097790 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4d9lv"]
Apr 16 18:10:10.098059 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.098042 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.100951 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.100932 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 18:10:10.101054 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.100976 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 18:10:10.101054 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.100932 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 18:10:10.101054 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.101032 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 18:10:10.111927 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.111904 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7gpt8"]
Apr 16 18:10:10.112103 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.112084 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.115024 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.115001 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:10:10.115131 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.115045 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:10:10.115131 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.115046 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkg79\""
Apr 16 18:10:10.120888 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120868 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"]
Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120896 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"]
Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120911 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c6f985f48-6wxzx"]
Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120922 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gpt8"]
Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120936
2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"] Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120951 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"] Apr 16 18:10:10.120975 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120968 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"] Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120980 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"] Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.120993 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-rxktg"] Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.121004 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"] Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.121016 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4d9lv"] Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.121024 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:10.121235 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.121031 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"] Apr 16 18:10:10.123730 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.123709 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:10:10.123831 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.123762 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:10:10.123831 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.123713 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:10:10.123916 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.123828 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7tj5h\"" Apr 16 18:10:10.149967 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.149935 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:10.150134 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.149936 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:10.150200 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.149936 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t" Apr 16 18:10:10.152915 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.152892 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:10:10.152915 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.152909 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:10:10.153543 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.153514 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w4nfv\"" Apr 16 18:10:10.153815 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.153797 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\"" Apr 16 18:10:10.160592 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.160571 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:10.160927 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.160733 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:10:10.160927 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.160829 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:10.660807718 +0000 UTC m=+33.129416475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found Apr 16 18:10:10.161171 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161037 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.161171 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161130 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:10.161286 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161178 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-tmp\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:10.161286 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161218 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.161286 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161246 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.161465 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161302 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtsh\" (UniqueName: \"kubernetes.io/projected/138781da-d4c5-4c2a-b64e-b740488095cf-kube-api-access-fwtsh\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.161465 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161330 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:10.161465 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161357 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0b4ef5-4d33-472c-bc36-98eecd77e026-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: 
\"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" Apr 16 18:10:10.161465 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161423 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.161465 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161450 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-stats-auth\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161477 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-tmp\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161529 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161555 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e658950a-b3d3-49ff-a5ce-4445a68ef06f-serving-cert\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161585 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e947e1e-4646-485c-a1cf-45fa455fe359-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161613 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttkpp\" (UniqueName: \"kubernetes.io/projected/367599ae-c563-446b-95e4-1f750b698283-kube-api-access-ttkpp\") pod \"network-check-source-7b678d77c7-h6q2h\" (UID: \"367599ae-c563-446b-95e4-1f750b698283\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161645 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfprr\" (UniqueName: \"kubernetes.io/projected/94f90437-3d9f-443f-9a60-c3237a421595-kube-api-access-cfprr\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" Apr 16 18:10:10.161693 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161671 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0b4ef5-4d33-472c-bc36-98eecd77e026-config\") pod 
\"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161710 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161743 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161775 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5869ca3b-420e-46b0-ac02-e5572d8d6b05-serving-cert\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161807 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " 
pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161835 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-config\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161862 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161893 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f90437-3d9f-443f-9a60-c3237a421595-serving-cert\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161920 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtswn\" (UniqueName: \"kubernetes.io/projected/4ba55740-f7dd-4c3c-9da7-00fa99217735-kube-api-access-gtswn\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.161985 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8h7\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.162016 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162018 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-default-certificate\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162062 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2rv\" (UniqueName: \"kubernetes.io/projected/e658950a-b3d3-49ff-a5ce-4445a68ef06f-kube-api-access-qv2rv\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162101 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxlbh\" (UniqueName: \"kubernetes.io/projected/5869ca3b-420e-46b0-ac02-e5572d8d6b05-kube-api-access-bxlbh\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162129 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ba55740-f7dd-4c3c-9da7-00fa99217735-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162221 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wsz\" (UniqueName: \"kubernetes.io/projected/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-kube-api-access-j5wsz\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162255 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkp6\" (UniqueName: \"kubernetes.io/projected/7d0b4ef5-4d33-472c-bc36-98eecd77e026-kube-api-access-pxkp6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162294 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-trusted-ca\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:10.162545 
ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162324 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162328 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162354 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e84c1343-65f5-4dda-a296-2f591522a539-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162384 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d72jd\" (UniqueName: 
\"kubernetes.io/projected/3c225701-67ef-433b-93da-7745170f4769-kube-api-access-d72jd\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:10.162545 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162535 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0b4ef5-4d33-472c-bc36-98eecd77e026-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.162574 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.162589 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.162617 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.162656 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.662638429 +0000 UTC m=+33.131247198 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.162680 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.662662996 +0000 UTC m=+33.131271751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162786 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162922 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-snapshots\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.162974 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f90437-3d9f-443f-9a60-c3237a421595-config\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.163012 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.163098 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.163026 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxg8t\" (UniqueName: \"kubernetes.io/projected/e84c1343-65f5-4dda-a296-2f591522a539-kube-api-access-zxg8t\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.163844 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.163826 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-config\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.163938 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.163930 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.663914741 +0000 UTC m=+33.132523519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt
Apr 16 18:10:10.164434 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.164297 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.164577 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.164554 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0b4ef5-4d33-472c-bc36-98eecd77e026-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.164726 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.164706 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.165153 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.165133 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f90437-3d9f-443f-9a60-c3237a421595-serving-cert\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.165229 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.165149 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e658950a-b3d3-49ff-a5ce-4445a68ef06f-trusted-ca\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.166512 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.166489 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e658950a-b3d3-49ff-a5ce-4445a68ef06f-serving-cert\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.168441 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.168383 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-stats-auth\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.168526 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.168480 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-default-certificate\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.169301 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.169251 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f90437-3d9f-443f-9a60-c3237a421595-config\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.170429 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.170368 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtsh\" (UniqueName: \"kubernetes.io/projected/138781da-d4c5-4c2a-b64e-b740488095cf-kube-api-access-fwtsh\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:10.170981 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.170937 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfprr\" (UniqueName: \"kubernetes.io/projected/94f90437-3d9f-443f-9a60-c3237a421595-kube-api-access-cfprr\") pod \"service-ca-operator-69965bb79d-whkpr\" (UID: \"94f90437-3d9f-443f-9a60-c3237a421595\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.171751 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.171706 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.172748 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.172694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8h7\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:10.172748 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.172727 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkp6\" (UniqueName: \"kubernetes.io/projected/7d0b4ef5-4d33-472c-bc36-98eecd77e026-kube-api-access-pxkp6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qvbqd\" (UID: \"7d0b4ef5-4d33-472c-bc36-98eecd77e026\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.172978 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.172947 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2rv\" (UniqueName: \"kubernetes.io/projected/e658950a-b3d3-49ff-a5ce-4445a68ef06f-kube-api-access-qv2rv\") pod \"console-operator-d87b8d5fc-z7qz9\" (UID: \"e658950a-b3d3-49ff-a5ce-4445a68ef06f\") " pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.174748 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.174694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72jd\" (UniqueName: \"kubernetes.io/projected/3c225701-67ef-433b-93da-7745170f4769-kube-api-access-d72jd\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:10.187548 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.187518 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"
Apr 16 18:10:10.244897 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.244866 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9"
Apr 16 18:10:10.263913 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.263873 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.263933 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-tmp\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.263965 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac8b3146-c9ef-45ff-a401-4847e957c45c-tmp-dir\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.263992 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264017 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264045 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e947e1e-4646-485c-a1cf-45fa455fe359-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:10.264073 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264070 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttkpp\" (UniqueName: \"kubernetes.io/projected/367599ae-c563-446b-95e4-1f750b698283-kube-api-access-ttkpp\") pod \"network-check-source-7b678d77c7-h6q2h\" (UID: \"367599ae-c563-446b-95e4-1f750b698283\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264101 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5869ca3b-420e-46b0-ac02-e5572d8d6b05-serving-cert\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264189 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8743f843-ea56-44b5-baa1-1718ae9ee68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264221 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtswn\" (UniqueName: \"kubernetes.io/projected/4ba55740-f7dd-4c3c-9da7-00fa99217735-kube-api-access-gtswn\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.264239 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264294 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxlbh\" (UniqueName: \"kubernetes.io/projected/5869ca3b-420e-46b0-ac02-e5572d8d6b05-kube-api-access-bxlbh\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.264380 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.264320 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.76429448 +0000 UTC m=+33.232903241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264420 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-tmp\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264480 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ba55740-f7dd-4c3c-9da7-00fa99217735-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264516 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6slh5\" (UniqueName: \"kubernetes.io/projected/8743f843-ea56-44b5-baa1-1718ae9ee68a-kube-api-access-6slh5\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264557 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac8b3146-c9ef-45ff-a401-4847e957c45c-config-volume\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264594 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wsz\" (UniqueName: \"kubernetes.io/projected/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-kube-api-access-j5wsz\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264637 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjp8\" (UniqueName: \"kubernetes.io/projected/ac8b3146-c9ef-45ff-a401-4847e957c45c-kube-api-access-kwjp8\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264683 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264719 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e84c1343-65f5-4dda-a296-2f591522a539-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.264746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264750 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264782 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b74g\" (UniqueName: \"kubernetes.io/projected/3399f6b2-54c3-4ead-9fed-519bebb162da-kube-api-access-7b74g\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264809 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264866 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-snapshots\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264908 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxg8t\" (UniqueName: \"kubernetes.io/projected/e84c1343-65f5-4dda-a296-2f591522a539-kube-api-access-zxg8t\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264963 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.264991 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-tmp\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265018 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.265157 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265049 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265307 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265381 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e947e1e-4646-485c-a1cf-45fa455fe359-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265615 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5869ca3b-420e-46b0-ac02-e5572d8d6b05-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265775 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5869ca3b-420e-46b0-ac02-e5572d8d6b05-snapshots\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265775 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-tmp\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.266008 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.265921 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ba55740-f7dd-4c3c-9da7-00fa99217735-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.266590 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.266570 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:10.266685 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.266631 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.766614252 +0000 UTC m=+33.235223008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:10.267200 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.267169 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.267610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.267589 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e84c1343-65f5-4dda-a296-2f591522a539-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.267906 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.267876 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5869ca3b-420e-46b0-ac02-e5572d8d6b05-serving-cert\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.275147 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.275093 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxg8t\" (UniqueName: \"kubernetes.io/projected/e84c1343-65f5-4dda-a296-2f591522a539-kube-api-access-zxg8t\") pod \"managed-serviceaccount-addon-agent-fbf4846df-zwdjn\" (UID: \"e84c1343-65f5-4dda-a296-2f591522a539\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
Apr 16 18:10:10.275234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.275195 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxlbh\" (UniqueName: \"kubernetes.io/projected/5869ca3b-420e-46b0-ac02-e5572d8d6b05-kube-api-access-bxlbh\") pod \"insights-operator-5785d4fcdd-rxktg\" (UID: \"5869ca3b-420e-46b0-ac02-e5572d8d6b05\") " pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.275549 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.275522 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtswn\" (UniqueName: \"kubernetes.io/projected/4ba55740-f7dd-4c3c-9da7-00fa99217735-kube-api-access-gtswn\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:10.276006 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.275974 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wsz\" (UniqueName: \"kubernetes.io/projected/95fe9d02-b7ea-45cc-b808-ac2f646c1f2c-kube-api-access-j5wsz\") pod \"klusterlet-addon-workmgr-b4cf49c9d-dpmhz\" (UID: \"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"
Apr 16 18:10:10.276078 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.276033 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttkpp\" (UniqueName: \"kubernetes.io/projected/367599ae-c563-446b-95e4-1f750b698283-kube-api-access-ttkpp\") pod \"network-check-source-7b678d77c7-h6q2h\" (UID: \"367599ae-c563-446b-95e4-1f750b698283\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"
Apr 16 18:10:10.277989 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.277967 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"
Apr 16 18:10:10.305849 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.305820 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"
Apr 16 18:10:10.350205 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.350119 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg"
Apr 16 18:10:10.365996 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.365961 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"
Apr 16 18:10:10.366312 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366284 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8743f843-ea56-44b5-baa1-1718ae9ee68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.366434 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366354 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6slh5\" (UniqueName: \"kubernetes.io/projected/8743f843-ea56-44b5-baa1-1718ae9ee68a-kube-api-access-6slh5\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.366434 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366386 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac8b3146-c9ef-45ff-a401-4847e957c45c-config-volume\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.366535 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366487 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwjp8\" (UniqueName: \"kubernetes.io/projected/ac8b3146-c9ef-45ff-a401-4847e957c45c-kube-api-access-kwjp8\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.366884 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b74g\" (UniqueName: \"kubernetes.io/projected/3399f6b2-54c3-4ead-9fed-519bebb162da-kube-api-access-7b74g\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:10.366973 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366887 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.366973 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366912 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.367081 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.366981 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.367081 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367009 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:10.367081 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367020 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8743f843-ea56-44b5-baa1-1718ae9ee68a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.367081 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367067 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac8b3146-c9ef-45ff-a401-4847e957c45c-tmp-dir\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:10.367250 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367097 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.367250 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.367110 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:10.367250 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367130 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"
Apr 16 18:10:10.367250 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.367165 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.867147253 +0000 UTC m=+33.335756025 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found Apr 16 18:10:10.367250 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367178 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac8b3146-c9ef-45ff-a401-4847e957c45c-config-volume\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:10.367508 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.367387 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:10.367508 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.367454 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:10.867438996 +0000 UTC m=+33.336047765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found Apr 16 18:10:10.367655 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.367627 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac8b3146-c9ef-45ff-a401-4847e957c45c-tmp-dir\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:10.369805 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.369783 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.373809 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.373783 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-ca\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.373919 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.373840 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.374086 
ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.374056 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8743f843-ea56-44b5-baa1-1718ae9ee68a-hub\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.376023 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.375996 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjp8\" (UniqueName: \"kubernetes.io/projected/ac8b3146-c9ef-45ff-a401-4847e957c45c-kube-api-access-kwjp8\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:10.376968 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.376948 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6slh5\" (UniqueName: \"kubernetes.io/projected/8743f843-ea56-44b5-baa1-1718ae9ee68a-kube-api-access-6slh5\") pod \"cluster-proxy-proxy-agent-695f5d8556-cbvv8\" (UID: \"8743f843-ea56-44b5-baa1-1718ae9ee68a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.377063 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.377042 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b74g\" (UniqueName: \"kubernetes.io/projected/3399f6b2-54c3-4ead-9fed-519bebb162da-kube-api-access-7b74g\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:10.383878 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.383842 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:10.400171 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.400141 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn" Apr 16 18:10:10.407944 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.407918 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 18:10:10.670392 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.670308 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:10.670574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.670469 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:10.670574 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670495 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:10:10.670574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.670502 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod 
\"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.670574 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670570 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.670547607 +0000 UTC m=+34.139156378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670580 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670619 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670640 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.670617 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:10.670785 ip-10-0-137-102 
kubenswrapper[2563]: E0416 18:10:10.670641 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.670626735 +0000 UTC m=+34.139235492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670704 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.670690364 +0000 UTC m=+34.139299122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt Apr 16 18:10:10.670785 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.670727 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.670717856 +0000 UTC m=+34.139326616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found Apr 16 18:10:10.771864 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.771829 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:10.772022 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.771965 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:10.772022 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.771971 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:10:10.772085 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.772033 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.772017899 +0000 UTC m=+34.240626658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:10:10.772126 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.772088 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:10:10.772175 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.772165 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.772153641 +0000 UTC m=+34.240762400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found Apr 16 18:10:10.872660 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.872575 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:10.872963 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.872731 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:10.872963 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.872791 2563 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:10.872963 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.872801 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.8727811 +0000 UTC m=+34.341389871 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found Apr 16 18:10:10.872963 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:10.872733 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:10.872963 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:10.872823 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.872813652 +0000 UTC m=+34.341422407 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found Apr 16 18:10:11.170428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.168866 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz"] Apr 16 18:10:11.171520 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.171209 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h"] Apr 16 18:10:11.178817 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.178772 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95fe9d02_b7ea_45cc_b808_ac2f646c1f2c.slice/crio-76edc350e9c2387f9b175831b13d6809a75cd957ac4a47299119545cb8aec24b WatchSource:0}: Error finding container 76edc350e9c2387f9b175831b13d6809a75cd957ac4a47299119545cb8aec24b: Status 404 returned error can't find the container with id 76edc350e9c2387f9b175831b13d6809a75cd957ac4a47299119545cb8aec24b Apr 16 18:10:11.179352 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.179311 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367599ae_c563_446b_95e4_1f750b698283.slice/crio-87cda574391a0a2467858bfbb09194b28010dd70a4a6247f55377c9ee86909d6 WatchSource:0}: Error finding container 87cda574391a0a2467858bfbb09194b28010dd70a4a6247f55377c9ee86909d6: Status 404 returned error can't find the container with id 87cda574391a0a2467858bfbb09194b28010dd70a4a6247f55377c9ee86909d6 Apr 16 18:10:11.186270 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.186244 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv"] Apr 16 18:10:11.189750 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.189714 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-z7qz9"] Apr 16 18:10:11.192867 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.192833 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-rxktg"] Apr 16 18:10:11.196875 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.196839 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd"] Apr 16 18:10:11.201503 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.201333 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0b4ef5_4d33_472c_bc36_98eecd77e026.slice/crio-4f581612b6c23d7003b702ccbc651387e79ed27a862fcf068e5436fd9748deec WatchSource:0}: Error finding container 4f581612b6c23d7003b702ccbc651387e79ed27a862fcf068e5436fd9748deec: Status 404 returned error can't find the container with id 4f581612b6c23d7003b702ccbc651387e79ed27a862fcf068e5436fd9748deec Apr 16 18:10:11.206589 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.206543 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr"] Apr 16 18:10:11.208892 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.208870 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8"] Apr 16 18:10:11.210162 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.210101 2563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f90437_3d9f_443f_9a60_c3237a421595.slice/crio-2f3269ecb0230023ffadb3049005b1dacd35f9a0746b4944645942e5129552d6 WatchSource:0}: Error finding container 2f3269ecb0230023ffadb3049005b1dacd35f9a0746b4944645942e5129552d6: Status 404 returned error can't find the container with id 2f3269ecb0230023ffadb3049005b1dacd35f9a0746b4944645942e5129552d6 Apr 16 18:10:11.212020 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.211998 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"] Apr 16 18:10:11.216616 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.216550 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8743f843_ea56_44b5_baa1_1718ae9ee68a.slice/crio-bffe7c8f210739faa790adbb6045e1f590e7973967de1e1152fd90adf22e9be6 WatchSource:0}: Error finding container bffe7c8f210739faa790adbb6045e1f590e7973967de1e1152fd90adf22e9be6: Status 404 returned error can't find the container with id bffe7c8f210739faa790adbb6045e1f590e7973967de1e1152fd90adf22e9be6 Apr 16 18:10:11.217305 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:11.217268 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode84c1343_65f5_4dda_a296_2f591522a539.slice/crio-95ba9a6462e5a8686e86c5c637e6cf29193ebb0c7a54ed8c86e2468cacbc16f4 WatchSource:0}: Error finding container 95ba9a6462e5a8686e86c5c637e6cf29193ebb0c7a54ed8c86e2468cacbc16f4: Status 404 returned error can't find the container with id 95ba9a6462e5a8686e86c5c637e6cf29193ebb0c7a54ed8c86e2468cacbc16f4 Apr 16 18:10:11.330651 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.330385 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" 
event={"ID":"94f90437-3d9f-443f-9a60-c3237a421595","Type":"ContainerStarted","Data":"2f3269ecb0230023ffadb3049005b1dacd35f9a0746b4944645942e5129552d6"} Apr 16 18:10:11.331416 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.331362 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" event={"ID":"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c","Type":"ContainerStarted","Data":"76edc350e9c2387f9b175831b13d6809a75cd957ac4a47299119545cb8aec24b"} Apr 16 18:10:11.332294 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.332269 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv" event={"ID":"ea87fd15-995a-4cb6-9f35-8ef427ef8e52","Type":"ContainerStarted","Data":"6b6f152a0b0fc5ba63bb0b4aca2b8c6f61f5746980c55334440f52e1a1444b9e"} Apr 16 18:10:11.333067 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.333042 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn" event={"ID":"e84c1343-65f5-4dda-a296-2f591522a539","Type":"ContainerStarted","Data":"95ba9a6462e5a8686e86c5c637e6cf29193ebb0c7a54ed8c86e2468cacbc16f4"} Apr 16 18:10:11.333900 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.333881 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h" event={"ID":"367599ae-c563-446b-95e4-1f750b698283","Type":"ContainerStarted","Data":"87cda574391a0a2467858bfbb09194b28010dd70a4a6247f55377c9ee86909d6"} Apr 16 18:10:11.334712 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.334694 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" event={"ID":"e658950a-b3d3-49ff-a5ce-4445a68ef06f","Type":"ContainerStarted","Data":"a1596acf379f9d14da3104c2a93ecc3a10ba137a589af91bfebefe69dc640b95"} 
Apr 16 18:10:11.335629 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.335612 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" event={"ID":"5869ca3b-420e-46b0-ac02-e5572d8d6b05","Type":"ContainerStarted","Data":"5cf6a172b794142d3614f0b862bd195569d9f04d1ac4d6eee27debfee8c591aa"} Apr 16 18:10:11.336582 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.336552 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" event={"ID":"7d0b4ef5-4d33-472c-bc36-98eecd77e026","Type":"ContainerStarted","Data":"4f581612b6c23d7003b702ccbc651387e79ed27a862fcf068e5436fd9748deec"} Apr 16 18:10:11.337442 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.337424 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerStarted","Data":"bffe7c8f210739faa790adbb6045e1f590e7973967de1e1152fd90adf22e9be6"} Apr 16 18:10:11.681849 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.681811 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:11.682035 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.681938 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:11.682035 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682014 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.681991402 +0000 UTC m=+36.150600157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt
Apr 16 18:10:11.682170 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682087 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:10:11.682170 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.682115 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:11.682170 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682136 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.682120743 +0000 UTC m=+36.150729499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.682173 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682200 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682214 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682252 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.682238986 +0000 UTC m=+36.150847741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682268 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:10:11.682310 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.682308 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.682297919 +0000 UTC m=+36.150906677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found
Apr 16 18:10:11.783630 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.783505 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587"
Apr 16 18:10:11.783810 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.783640 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:11.783810 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.783716 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:11.783901 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.783859 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:11.783936 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.783920 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.783903096 +0000 UTC m=+36.252511854 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:11.784462 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.784291 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:10:11.784462 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.784341 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs podName:5b1617ae-f25b-4a90-adf4-ca28c7c22774 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.784326539 +0000 UTC m=+66.252935296 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs") pod "network-metrics-daemon-pq587" (UID: "5b1617ae-f25b-4a90-adf4-ca28c7c22774") : secret "metrics-daemon-secret" not found
Apr 16 18:10:11.784462 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.784408 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:11.784462 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.784444 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.784431351 +0000 UTC m=+36.253040108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:11.884994 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.884954 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.885054 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.885090 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.885271 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.885344 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.885323267 +0000 UTC m=+36.353932036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.885269 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:11.886038 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:11.885598 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:13.885575252 +0000 UTC m=+36.354184021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:11.889915 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.889865 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqck6\" (UniqueName: \"kubernetes.io/projected/76ba3cac-7c44-4ba0-aefc-cfded09ee26e-kube-api-access-mqck6\") pod \"network-check-target-4xm4j\" (UID: \"76ba3cac-7c44-4ba0-aefc-cfded09ee26e\") " pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:10:11.998790 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:11.998301 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:10:12.170343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:12.170260 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4xm4j"]
Apr 16 18:10:12.186235 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:12.186158 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ba3cac_7c44_4ba0_aefc_cfded09ee26e.slice/crio-66e43d4322c46b12acee18e845636a47fc1092e58ced106181bf7aef8334dd49 WatchSource:0}: Error finding container 66e43d4322c46b12acee18e845636a47fc1092e58ced106181bf7aef8334dd49: Status 404 returned error can't find the container with id 66e43d4322c46b12acee18e845636a47fc1092e58ced106181bf7aef8334dd49
Apr 16 18:10:12.343479 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:12.343342 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4xm4j" event={"ID":"76ba3cac-7c44-4ba0-aefc-cfded09ee26e","Type":"ContainerStarted","Data":"66e43d4322c46b12acee18e845636a47fc1092e58ced106181bf7aef8334dd49"}
Apr 16 18:10:12.356850 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:12.356091 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="1e98edcee37e08e4e4ac8f2232be4b1fb581720736c4586f2e5d944a03df2868" exitCode=0
Apr 16 18:10:12.356850 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:12.356163 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"1e98edcee37e08e4e4ac8f2232be4b1fb581720736c4586f2e5d944a03df2868"}
Apr 16 18:10:13.399717 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.398675 2563 generic.go:358] "Generic (PLEG): container finished" podID="afb3aa46-f688-46a6-9d9f-7529d606c9dc" containerID="04ca7ecbf6eb2e566a794a88b6ef0f2c4d6a17c93e1fd891f7cd46dc301ceb92" exitCode=0
Apr 16 18:10:13.399717 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.398731 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerDied","Data":"04ca7ecbf6eb2e566a794a88b6ef0f2c4d6a17c93e1fd891f7cd46dc301ceb92"}
Apr 16 18:10:13.707334 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.707299 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:13.707525 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.707443 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:13.707525 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.707478 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:13.707525 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.707508 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.707696 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.707676369 +0000 UTC m=+40.176285128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.707825 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.707846 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.707901 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.707884747 +0000 UTC m=+40.176493516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.707972 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.708011 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.707999137 +0000 UTC m=+40.176607907 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.708062 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:10:13.708171 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.708092 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.708082399 +0000 UTC m=+40.176691167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found
Apr 16 18:10:13.808581 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.808546 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:13.808809 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.808738 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:13.809177 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.808924 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:13.809177 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.808995 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.808975364 +0000 UTC m=+40.277584125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:13.809480 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.809438 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:13.809577 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.809490 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.809475308 +0000 UTC m=+40.278084069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:13.909871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.909800 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:13.910051 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:13.909932 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:13.910229 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.910184 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:13.910299 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.910246 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.9102279 +0000 UTC m=+40.378836659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found
Apr 16 18:10:13.910666 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.910648 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:13.910734 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:13.910697 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.910683424 +0000 UTC m=+40.379292180 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:14.443025 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:14.442984 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" event={"ID":"afb3aa46-f688-46a6-9d9f-7529d606c9dc","Type":"ContainerStarted","Data":"2b72ea47117820ca50a0e7d2a6ed202d09ca59eef6ed35119a7836c44149e79f"}
Apr 16 18:10:14.469888 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:14.469557 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v9jdb" podStartSLOduration=6.083627425 podStartE2EDuration="36.469536314s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:09:40.79370858 +0000 UTC m=+3.262317341" lastFinishedPulling="2026-04-16 18:10:11.179617461 +0000 UTC m=+33.648226230" observedRunningTime="2026-04-16 18:10:14.466224104 +0000 UTC m=+36.934832882" watchObservedRunningTime="2026-04-16 18:10:14.469536314 +0000 UTC m=+36.938145092"
Apr 16 18:10:16.039355 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:16.039318 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t"
Apr 16 18:10:16.044176 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:16.044151 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/50d6b6c6-7edb-4c20-9fa7-9d7f97465217-original-pull-secret\") pod \"global-pull-secret-syncer-5987t\" (UID: \"50d6b6c6-7edb-4c20-9fa7-9d7f97465217\") " pod="kube-system/global-pull-secret-syncer-5987t"
Apr 16 18:10:16.208811 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:16.208778 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5987t"
Apr 16 18:10:17.755900 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.755852 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.755976 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.756009 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756021 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.756039 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx"
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756101 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.756079751 +0000 UTC m=+48.224688509 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756141 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756164 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756164 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756176 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.756156859 +0000 UTC m=+48.224765628 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756210 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.756196686 +0000 UTC m=+48.224805451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found
Apr 16 18:10:17.756506 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.756226 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.756218001 +0000 UTC m=+48.224826755 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found
Apr 16 18:10:17.857508 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.857472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"
Apr 16 18:10:17.857684 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.857550 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"
Apr 16 18:10:17.857684 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.857648 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:17.857804 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.857713 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:17.857804 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.857733 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.857712246 +0000 UTC m=+48.326321003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:17.857804 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.857791 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.857773397 +0000 UTC m=+48.326382156 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:10:17.958332 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.958297 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:10:17.958516 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:17.958356 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8"
Apr 16 18:10:17.958516 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.958461 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:17.958516 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.958464 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:17.958516 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.958512 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.958498278 +0000 UTC m=+48.427107033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found
Apr 16 18:10:17.958657 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:17.958525 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:25.958519677 +0000 UTC m=+48.427128432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:24.278998 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.278895 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5987t"]
Apr 16 18:10:24.281493 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:24.281469 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d6b6c6_7edb_4c20_9fa7_9d7f97465217.slice/crio-d3db9a58d7a8bd6e94564556eeb2c91345e80fbd168b7691c3a2aaad6e0b9ac3 WatchSource:0}: Error finding container d3db9a58d7a8bd6e94564556eeb2c91345e80fbd168b7691c3a2aaad6e0b9ac3: Status 404 returned error can't find the container with id d3db9a58d7a8bd6e94564556eeb2c91345e80fbd168b7691c3a2aaad6e0b9ac3
Apr 16 18:10:24.480148 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.479142 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" event={"ID":"94f90437-3d9f-443f-9a60-c3237a421595","Type":"ContainerStarted","Data":"7aee0a3ff859d01a834e56512cf2a69c7742604b8f69202af3953861db66f6dd"}
Apr 16 18:10:24.488227 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.488107 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv" event={"ID":"ea87fd15-995a-4cb6-9f35-8ef427ef8e52","Type":"ContainerStarted","Data":"f41c8010656ac41db95e244b4a25b2aa2fb4f189f0ba545b2e1423772ac92e37"}
Apr 16 18:10:24.499938 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.499901 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn"
event={"ID":"e84c1343-65f5-4dda-a296-2f591522a539","Type":"ContainerStarted","Data":"55ee8e640aaa4e7dbea02afba589e8e51d3214037ecd9ec7d024d401c7fa722d"} Apr 16 18:10:24.507648 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.505932 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" event={"ID":"e658950a-b3d3-49ff-a5ce-4445a68ef06f","Type":"ContainerStarted","Data":"6ef7fcb50e860c9d16ee0f1b3f16e6fe5221657f878123cb7436f5021604cde5"} Apr 16 18:10:24.508496 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.508472 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:24.511134 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.511077 2563 patch_prober.go:28] interesting pod/console-operator-d87b8d5fc-z7qz9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.9:8443/readyz\": dial tcp 10.132.0.9:8443: connect: connection refused" start-of-body= Apr 16 18:10:24.511351 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.511296 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.9:8443/readyz\": dial tcp 10.132.0.9:8443: connect: connection refused" Apr 16 18:10:24.513943 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.513221 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" event={"ID":"5869ca3b-420e-46b0-ac02-e5572d8d6b05","Type":"ContainerStarted","Data":"fd3a42b3e94dde3fceeffeba5a2d9cff2dc2163df615fd34e604d3bc73bc3529"} Apr 16 18:10:24.515165 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.514947 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-5987t" event={"ID":"50d6b6c6-7edb-4c20-9fa7-9d7f97465217","Type":"ContainerStarted","Data":"d3db9a58d7a8bd6e94564556eeb2c91345e80fbd168b7691c3a2aaad6e0b9ac3"} Apr 16 18:10:24.516846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.516516 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" event={"ID":"7d0b4ef5-4d33-472c-bc36-98eecd77e026","Type":"ContainerStarted","Data":"a9fc3b3d408ccbe8648e7917cc22a4379bbf4eec09d0a1bbbc0ad4bd229b2328"} Apr 16 18:10:24.526681 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.524780 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" podStartSLOduration=27.64648868 podStartE2EDuration="40.524763562s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.212687672 +0000 UTC m=+33.681296444" lastFinishedPulling="2026-04-16 18:10:24.090962558 +0000 UTC m=+46.559571326" observedRunningTime="2026-04-16 18:10:24.523297754 +0000 UTC m=+46.991906533" watchObservedRunningTime="2026-04-16 18:10:24.524763562 +0000 UTC m=+46.993372339" Apr 16 18:10:24.545387 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.544936 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" podStartSLOduration=27.654718845 podStartE2EDuration="40.544916682s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.200713648 +0000 UTC m=+33.669322413" lastFinishedPulling="2026-04-16 18:10:24.090911481 +0000 UTC m=+46.559520250" observedRunningTime="2026-04-16 18:10:24.542989183 +0000 UTC m=+47.011597961" watchObservedRunningTime="2026-04-16 18:10:24.544916682 +0000 UTC m=+47.013525460" Apr 16 18:10:24.594491 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:24.592716 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podStartSLOduration=27.604209581 podStartE2EDuration="40.59269607s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.198243415 +0000 UTC m=+33.666852184" lastFinishedPulling="2026-04-16 18:10:24.186729904 +0000 UTC m=+46.655338673" observedRunningTime="2026-04-16 18:10:24.592016135 +0000 UTC m=+47.060624912" watchObservedRunningTime="2026-04-16 18:10:24.59269607 +0000 UTC m=+47.061304848" Apr 16 18:10:24.594491 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.593765 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z46bv" podStartSLOduration=28.252634159 podStartE2EDuration="40.593752722s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.199631077 +0000 UTC m=+33.668239833" lastFinishedPulling="2026-04-16 18:10:23.540749637 +0000 UTC m=+46.009358396" observedRunningTime="2026-04-16 18:10:24.565841674 +0000 UTC m=+47.034450452" watchObservedRunningTime="2026-04-16 18:10:24.593752722 +0000 UTC m=+47.062361500" Apr 16 18:10:24.615794 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:24.615641 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fbf4846df-zwdjn" podStartSLOduration=23.647871055 podStartE2EDuration="36.615621957s" podCreationTimestamp="2026-04-16 18:09:48 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.219450656 +0000 UTC m=+33.688059411" lastFinishedPulling="2026-04-16 18:10:24.187201544 +0000 UTC m=+46.655810313" observedRunningTime="2026-04-16 18:10:24.614267151 +0000 UTC m=+47.082875927" watchObservedRunningTime="2026-04-16 18:10:24.615621957 +0000 UTC m=+47.084230739" Apr 16 18:10:24.641388 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:10:24.641326 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" podStartSLOduration=28.303782114 podStartE2EDuration="40.641305651s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.203228152 +0000 UTC m=+33.671836910" lastFinishedPulling="2026-04-16 18:10:23.540751678 +0000 UTC m=+46.009360447" observedRunningTime="2026-04-16 18:10:24.639231644 +0000 UTC m=+47.107840434" watchObservedRunningTime="2026-04-16 18:10:24.641305651 +0000 UTC m=+47.109914438" Apr 16 18:10:25.525777 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.525186 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" event={"ID":"95fe9d02-b7ea-45cc-b808-ac2f646c1f2c","Type":"ContainerStarted","Data":"b1a9d8e64edd1340f5b30426f34463178fbf958fccedaf9c4c46014391935583"} Apr 16 18:10:25.526253 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.526057 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:25.528335 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.528297 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" Apr 16 18:10:25.530193 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.530150 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h" event={"ID":"367599ae-c563-446b-95e4-1f750b698283","Type":"ContainerStarted","Data":"fd2967fbf9703310c316f3d7200912092bae1bd904f819f13a671d07d0561e70"} Apr 16 18:10:25.532998 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.532718 2563 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/0.log" Apr 16 18:10:25.532998 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.532757 2563 generic.go:358] "Generic (PLEG): container finished" podID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" containerID="6ef7fcb50e860c9d16ee0f1b3f16e6fe5221657f878123cb7436f5021604cde5" exitCode=255 Apr 16 18:10:25.532998 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.532830 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" event={"ID":"e658950a-b3d3-49ff-a5ce-4445a68ef06f","Type":"ContainerDied","Data":"6ef7fcb50e860c9d16ee0f1b3f16e6fe5221657f878123cb7436f5021604cde5"} Apr 16 18:10:25.533283 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.533038 2563 scope.go:117] "RemoveContainer" containerID="6ef7fcb50e860c9d16ee0f1b3f16e6fe5221657f878123cb7436f5021604cde5" Apr 16 18:10:25.535193 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.535088 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4xm4j" event={"ID":"76ba3cac-7c44-4ba0-aefc-cfded09ee26e","Type":"ContainerStarted","Data":"8d9050e76d207a7e25f5a254cc4a7e43530c9decdc7948b8fc46920de4b351e4"} Apr 16 18:10:25.535193 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.535116 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4xm4j" Apr 16 18:10:25.539289 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.539241 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerStarted","Data":"490ea32a221284d4b7f41bf52240c577b68d909ca8cd27ce985034c02748c502"} Apr 16 18:10:25.562425 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:25.561994 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b4cf49c9d-dpmhz" podStartSLOduration=24.557939943 podStartE2EDuration="37.561972692s" podCreationTimestamp="2026-04-16 18:09:48 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.182544648 +0000 UTC m=+33.651153402" lastFinishedPulling="2026-04-16 18:10:24.18657738 +0000 UTC m=+46.655186151" observedRunningTime="2026-04-16 18:10:25.545162409 +0000 UTC m=+48.013771188" watchObservedRunningTime="2026-04-16 18:10:25.561972692 +0000 UTC m=+48.030581470" Apr 16 18:10:25.598007 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.597626 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4xm4j" podStartSLOduration=35.561183331 podStartE2EDuration="47.597606306s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:10:12.194174879 +0000 UTC m=+34.662783640" lastFinishedPulling="2026-04-16 18:10:24.23059786 +0000 UTC m=+46.699206615" observedRunningTime="2026-04-16 18:10:25.595627504 +0000 UTC m=+48.064236297" watchObservedRunningTime="2026-04-16 18:10:25.597606306 +0000 UTC m=+48.066215085" Apr 16 18:10:25.835787 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.835650 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:25.835787 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.835767 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:25.836064 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.835799 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:25.836064 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.835829 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:25.836064 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.835965 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.835947059 +0000 UTC m=+64.304555819 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : configmap references non-existent config key: service-ca.crt Apr 16 18:10:25.836064 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836049 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836081 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls podName:3c225701-67ef-433b-93da-7745170f4769 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.8360707 +0000 UTC m=+64.304679462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls") pod "cluster-samples-operator-667775844f-8mn2l" (UID: "3c225701-67ef-433b-93da-7745170f4769") : secret "samples-operator-tls" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836137 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836147 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc669799-6pc56: secret "image-registry-tls" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836177 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls podName:8668273d-77af-4f62-9e62-4fcaab486fec nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:41.836167745 +0000 UTC m=+64.304776505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls") pod "image-registry-57dc669799-6pc56" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec") : secret "image-registry-tls" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836224 2563 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:10:25.836326 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.836250 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs podName:138781da-d4c5-4c2a-b64e-b740488095cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.836241262 +0000 UTC m=+64.304850021 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs") pod "router-default-c6f985f48-6wxzx" (UID: "138781da-d4c5-4c2a-b64e-b740488095cf") : secret "router-metrics-certs-default" not found Apr 16 18:10:25.936774 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.936731 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:25.936946 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:25.936827 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:25.937014 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.936975 2563 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:10:25.937069 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.937041 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls podName:4ba55740-f7dd-4c3c-9da7-00fa99217735 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.937022584 +0000 UTC m=+64.405631350 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-8k8jm" (UID: "4ba55740-f7dd-4c3c-9da7-00fa99217735") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:10:25.937332 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.937233 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:10:25.937332 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:25.937298 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert podName:8e947e1e-4646-485c-a1cf-45fa455fe359 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.937280899 +0000 UTC m=+64.405889657 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-tjvkf" (UID: "8e947e1e-4646-485c-a1cf-45fa455fe359") : secret "networking-console-plugin-cert" not found Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.037452 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.037537 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:26.037690 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:26.037745 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert podName:3399f6b2-54c3-4ead-9fed-519bebb162da nodeName:}" failed. No retries permitted until 2026-04-16 18:10:42.037728334 +0000 UTC m=+64.506337091 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert") pod "ingress-canary-7gpt8" (UID: "3399f6b2-54c3-4ead-9fed-519bebb162da") : secret "canary-serving-cert" not found Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:26.038129 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:26.038772 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:26.038175 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls podName:ac8b3146-c9ef-45ff-a401-4847e957c45c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:42.038160112 +0000 UTC m=+64.506768870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls") pod "dns-default-4d9lv" (UID: "ac8b3146-c9ef-45ff-a401-4847e957c45c") : secret "dns-default-metrics-tls" not found Apr 16 18:10:26.543172 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543137 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:10:26.543652 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543576 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/0.log" Apr 16 18:10:26.543652 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543612 2563 generic.go:358] "Generic (PLEG): container finished" podID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" exitCode=255 Apr 16 18:10:26.543765 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543752 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" event={"ID":"e658950a-b3d3-49ff-a5ce-4445a68ef06f","Type":"ContainerDied","Data":"7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775"} Apr 16 18:10:26.543820 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543787 2563 scope.go:117] "RemoveContainer" containerID="6ef7fcb50e860c9d16ee0f1b3f16e6fe5221657f878123cb7436f5021604cde5" Apr 16 18:10:26.544065 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.543999 2563 scope.go:117] "RemoveContainer" containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" Apr 16 18:10:26.544284 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:26.544228 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-z7qz9_openshift-console-operator(e658950a-b3d3-49ff-a5ce-4445a68ef06f)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" Apr 16 18:10:26.562578 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:26.562519 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-h6q2h" podStartSLOduration=29.557173972 podStartE2EDuration="42.562500445s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.182240561 +0000 UTC m=+33.650849317" lastFinishedPulling="2026-04-16 18:10:24.18756702 +0000 UTC m=+46.656175790" observedRunningTime="2026-04-16 18:10:25.614380278 +0000 UTC m=+48.082989036" watchObservedRunningTime="2026-04-16 18:10:26.562500445 +0000 UTC m=+49.031109219" Apr 16 18:10:27.326786 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.326713 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-8pgh6"] Apr 16 18:10:27.329725 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.329681 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5lcj4_1297cac1-5827-4690-b5d6-38c2ba71da4e/dns-node-resolver/0.log" Apr 16 18:10:27.350380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.349646 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8pgh6"] Apr 16 18:10:27.350380 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.349800 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.353532 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.353310 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:10:27.353532 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.353421 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:10:27.353532 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.353492 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bxtnw\"" Apr 16 18:10:27.453384 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.453345 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.453584 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.453408 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-ghfcz\" (UniqueName: \"kubernetes.io/projected/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-api-access-ghfcz\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.453584 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.453526 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f08fcec0-1201-4e71-9f48-1e8276df4e23-data-volume\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.453584 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.453570 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.453725 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.453635 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f08fcec0-1201-4e71-9f48-1e8276df4e23-crio-socket\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.550720 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.550690 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:10:27.551166 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.551101 2563 
scope.go:117] "RemoveContainer" containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" Apr 16 18:10:27.551282 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:27.551264 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-z7qz9_openshift-console-operator(e658950a-b3d3-49ff-a5ce-4445a68ef06f)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" Apr 16 18:10:27.554457 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554431 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghfcz\" (UniqueName: \"kubernetes.io/projected/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-api-access-ghfcz\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554596 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554483 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f08fcec0-1201-4e71-9f48-1e8276df4e23-data-volume\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554596 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554518 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554596 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:27.554560 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f08fcec0-1201-4e71-9f48-1e8276df4e23-crio-socket\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554768 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:27.554664 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 18:10:27.554768 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:27.554736 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls podName:f08fcec0-1201-4e71-9f48-1e8276df4e23 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:28.054717607 +0000 UTC m=+50.523326365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8pgh6" (UID: "f08fcec0-1201-4e71-9f48-1e8276df4e23") : secret "insights-runtime-extractor-tls" not found Apr 16 18:10:27.554768 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554754 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f08fcec0-1201-4e71-9f48-1e8276df4e23-crio-socket\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554933 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554832 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.554933 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.554885 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f08fcec0-1201-4e71-9f48-1e8276df4e23-data-volume\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.555462 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.555439 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.563369 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.563349 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghfcz\" (UniqueName: \"kubernetes.io/projected/f08fcec0-1201-4e71-9f48-1e8276df4e23-kube-api-access-ghfcz\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:27.895905 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.895870 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg"] Apr 16 18:10:27.918923 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.918878 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg"] Apr 16 18:10:27.919107 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.919031 
2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" Apr 16 18:10:27.922031 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.921984 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-d79rn\"" Apr 16 18:10:27.922031 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.921981 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:10:27.922307 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.921983 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:10:27.959353 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:27.959318 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pkwb\" (UniqueName: \"kubernetes.io/projected/00d41a7d-3c9c-48f9-adde-5ceda2b430a0-kube-api-access-7pkwb\") pod \"migrator-64d4d94569-5hrgg\" (UID: \"00d41a7d-3c9c-48f9-adde-5ceda2b430a0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" Apr 16 18:10:28.060883 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.060845 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:28.061071 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.060938 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pkwb\" (UniqueName: 
\"kubernetes.io/projected/00d41a7d-3c9c-48f9-adde-5ceda2b430a0-kube-api-access-7pkwb\") pod \"migrator-64d4d94569-5hrgg\" (UID: \"00d41a7d-3c9c-48f9-adde-5ceda2b430a0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" Apr 16 18:10:28.061071 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:28.061021 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 18:10:28.061184 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:28.061089 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls podName:f08fcec0-1201-4e71-9f48-1e8276df4e23 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:29.061073953 +0000 UTC m=+51.529682707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8pgh6" (UID: "f08fcec0-1201-4e71-9f48-1e8276df4e23") : secret "insights-runtime-extractor-tls" not found Apr 16 18:10:28.069505 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.069480 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pkwb\" (UniqueName: \"kubernetes.io/projected/00d41a7d-3c9c-48f9-adde-5ceda2b430a0-kube-api-access-7pkwb\") pod \"migrator-64d4d94569-5hrgg\" (UID: \"00d41a7d-3c9c-48f9-adde-5ceda2b430a0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" Apr 16 18:10:28.230759 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.230715 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" Apr 16 18:10:28.327533 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.327498 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4fj5x_757de476-6da8-4345-b7fc-6c36ed994dea/node-ca/0.log" Apr 16 18:10:28.753574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.753534 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pj7kv"] Apr 16 18:10:28.757373 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.757347 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.760449 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.760418 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:10:28.760449 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.760440 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-5286q\"" Apr 16 18:10:28.760673 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.760424 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:10:28.762253 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.762235 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:10:28.762360 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.762317 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:10:28.771336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.771314 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pj7kv"] Apr 16 
18:10:28.870683 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.870647 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-key\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.870866 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.870840 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtfs\" (UniqueName: \"kubernetes.io/projected/84a86a57-6af8-4694-9c13-5149ef0a0f52-kube-api-access-rgtfs\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.870960 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.870932 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-cabundle\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.971589 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.971554 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtfs\" (UniqueName: \"kubernetes.io/projected/84a86a57-6af8-4694-9c13-5149ef0a0f52-kube-api-access-rgtfs\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.971768 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.971603 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-cabundle\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.971768 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.971726 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-key\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.972969 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.972943 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-cabundle\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.974324 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.974293 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84a86a57-6af8-4694-9c13-5149ef0a0f52-signing-key\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:28.982544 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:28.982520 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtfs\" (UniqueName: \"kubernetes.io/projected/84a86a57-6af8-4694-9c13-5149ef0a0f52-kube-api-access-rgtfs\") pod \"service-ca-bfc587fb7-pj7kv\" (UID: \"84a86a57-6af8-4694-9c13-5149ef0a0f52\") " pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:29.068021 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.067897 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" Apr 16 18:10:29.072451 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.072423 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:29.072620 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:29.072599 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 18:10:29.072728 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:29.072663 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls podName:f08fcec0-1201-4e71-9f48-1e8276df4e23 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:31.072641934 +0000 UTC m=+53.541250694 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8pgh6" (UID: "f08fcec0-1201-4e71-9f48-1e8276df4e23") : secret "insights-runtime-extractor-tls" not found Apr 16 18:10:29.133208 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.132715 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg"] Apr 16 18:10:29.137723 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:29.137476 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d41a7d_3c9c_48f9_adde_5ceda2b430a0.slice/crio-efadd12f0fd2e05e46964ec96f91cfc2d233588f91aeb1e4dbcf288a9be1e6f4 WatchSource:0}: Error finding container efadd12f0fd2e05e46964ec96f91cfc2d233588f91aeb1e4dbcf288a9be1e6f4: Status 404 returned error can't find the container with id efadd12f0fd2e05e46964ec96f91cfc2d233588f91aeb1e4dbcf288a9be1e6f4 Apr 16 18:10:29.218805 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.218741 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pj7kv"] Apr 16 18:10:29.221321 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:29.221291 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a86a57_6af8_4694_9c13_5149ef0a0f52.slice/crio-1fb5c6966b507a2c17ec8f04b06b20df62089bce34a997d569a037fe4c75fb69 WatchSource:0}: Error finding container 1fb5c6966b507a2c17ec8f04b06b20df62089bce34a997d569a037fe4c75fb69: Status 404 returned error can't find the container with id 1fb5c6966b507a2c17ec8f04b06b20df62089bce34a997d569a037fe4c75fb69 Apr 16 18:10:29.557423 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.557320 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" event={"ID":"00d41a7d-3c9c-48f9-adde-5ceda2b430a0","Type":"ContainerStarted","Data":"efadd12f0fd2e05e46964ec96f91cfc2d233588f91aeb1e4dbcf288a9be1e6f4"} Apr 16 18:10:29.558676 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.558648 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5987t" event={"ID":"50d6b6c6-7edb-4c20-9fa7-9d7f97465217","Type":"ContainerStarted","Data":"1d0c956cca87829b1f36941c4a5131d248601c5b74aa893a7ed2505d1d772c3e"} Apr 16 18:10:29.559976 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.559955 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" event={"ID":"84a86a57-6af8-4694-9c13-5149ef0a0f52","Type":"ContainerStarted","Data":"0e265e2b4d87d68002198792dc3f6416cd991dfaf67477b5d5274fe1fa8c40e5"} Apr 16 18:10:29.560063 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.559983 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" event={"ID":"84a86a57-6af8-4694-9c13-5149ef0a0f52","Type":"ContainerStarted","Data":"1fb5c6966b507a2c17ec8f04b06b20df62089bce34a997d569a037fe4c75fb69"} Apr 16 18:10:29.561655 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.561636 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerStarted","Data":"784a813537c61336b8d20862609724a48ca07a1aee9e28c857f793ee02cb459c"} Apr 16 18:10:29.561730 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.561664 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerStarted","Data":"398f91424ac1fe8a9e59c8a044075477d23a40498f8b9f86aec59268fbd15dc7"} 
Apr 16 18:10:29.575574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.575520 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5987t" podStartSLOduration=24.849729782 podStartE2EDuration="29.575503954s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:24.283070258 +0000 UTC m=+46.751679014" lastFinishedPulling="2026-04-16 18:10:29.008844427 +0000 UTC m=+51.477453186" observedRunningTime="2026-04-16 18:10:29.574740323 +0000 UTC m=+52.043349102" watchObservedRunningTime="2026-04-16 18:10:29.575503954 +0000 UTC m=+52.044112722" Apr 16 18:10:29.594705 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.594640 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" podStartSLOduration=23.822411394 podStartE2EDuration="41.594623193s" podCreationTimestamp="2026-04-16 18:09:48 +0000 UTC" firstStartedPulling="2026-04-16 18:10:11.218791612 +0000 UTC m=+33.687400373" lastFinishedPulling="2026-04-16 18:10:28.991003402 +0000 UTC m=+51.459612172" observedRunningTime="2026-04-16 18:10:29.59315311 +0000 UTC m=+52.061761886" watchObservedRunningTime="2026-04-16 18:10:29.594623193 +0000 UTC m=+52.063231971" Apr 16 18:10:29.611365 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:29.611307 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-pj7kv" podStartSLOduration=1.6112878880000001 podStartE2EDuration="1.611287888s" podCreationTimestamp="2026-04-16 18:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:29.610131978 +0000 UTC m=+52.078740756" watchObservedRunningTime="2026-04-16 18:10:29.611287888 +0000 UTC m=+52.079896664" Apr 16 18:10:30.245909 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:30.245860 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:30.246412 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:30.246377 2563 scope.go:117] "RemoveContainer" containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" Apr 16 18:10:30.246657 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:30.246634 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-z7qz9_openshift-console-operator(e658950a-b3d3-49ff-a5ce-4445a68ef06f)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" Apr 16 18:10:30.568050 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:30.567953 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" event={"ID":"00d41a7d-3c9c-48f9-adde-5ceda2b430a0","Type":"ContainerStarted","Data":"92069d038ae4f4e136ddd8d3f82d28e3ad4567211fc2727471c1fbcfc57b61a9"} Apr 16 18:10:30.568050 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:30.568001 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" event={"ID":"00d41a7d-3c9c-48f9-adde-5ceda2b430a0","Type":"ContainerStarted","Data":"f1d3c83bb41537e0319ea62fbcbf26f84dcc146a5715263080f7bb63c5e99c92"} Apr 16 18:10:30.583582 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:30.583534 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5hrgg" podStartSLOduration=2.381466694 podStartE2EDuration="3.583520241s" podCreationTimestamp="2026-04-16 18:10:27 +0000 UTC" firstStartedPulling="2026-04-16 18:10:29.139998594 +0000 UTC 
m=+51.608607371" lastFinishedPulling="2026-04-16 18:10:30.342052149 +0000 UTC m=+52.810660918" observedRunningTime="2026-04-16 18:10:30.583089717 +0000 UTC m=+53.051698495" watchObservedRunningTime="2026-04-16 18:10:30.583520241 +0000 UTC m=+53.052129018" Apr 16 18:10:31.092894 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:31.092847 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:31.093200 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:31.093126 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 18:10:31.093200 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:31.093188 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls podName:f08fcec0-1201-4e71-9f48-1e8276df4e23 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:35.093170247 +0000 UTC m=+57.561779006 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8pgh6" (UID: "f08fcec0-1201-4e71-9f48-1e8276df4e23") : secret "insights-runtime-extractor-tls" not found Apr 16 18:10:34.507017 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:34.506986 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:34.507505 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:34.507486 2563 scope.go:117] "RemoveContainer" containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" Apr 16 18:10:34.507750 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:34.507721 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-z7qz9_openshift-console-operator(e658950a-b3d3-49ff-a5ce-4445a68ef06f)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" Apr 16 18:10:35.133507 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:35.133469 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:35.133690 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:35.133628 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 18:10:35.133734 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:10:35.133698 
2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls podName:f08fcec0-1201-4e71-9f48-1e8276df4e23 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.133680459 +0000 UTC m=+65.602289217 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8pgh6" (UID: "f08fcec0-1201-4e71-9f48-1e8276df4e23") : secret "insights-runtime-extractor-tls" not found Apr 16 18:10:35.332189 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:35.332160 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpjc" Apr 16 18:10:41.895182 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.895147 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:41.895182 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.895186 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:41.895726 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.895207 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod 
\"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:41.895726 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.895295 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:41.895992 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.895966 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/138781da-d4c5-4c2a-b64e-b740488095cf-service-ca-bundle\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:41.898088 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.898054 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c225701-67ef-433b-93da-7745170f4769-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-8mn2l\" (UID: \"3c225701-67ef-433b-93da-7745170f4769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:41.898184 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.898140 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/138781da-d4c5-4c2a-b64e-b740488095cf-metrics-certs\") pod \"router-default-c6f985f48-6wxzx\" (UID: \"138781da-d4c5-4c2a-b64e-b740488095cf\") " pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:41.898184 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.898160 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"image-registry-57dc669799-6pc56\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") " pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:41.995906 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.995862 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:41.996084 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.995991 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:41.998390 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.998361 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55740-f7dd-4c3c-9da7-00fa99217735-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-8k8jm\" (UID: \"4ba55740-f7dd-4c3c-9da7-00fa99217735\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:41.998519 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:41.998453 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/8e947e1e-4646-485c-a1cf-45fa455fe359-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-tjvkf\" (UID: \"8e947e1e-4646-485c-a1cf-45fa455fe359\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:42.029619 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.029592 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zpwl4\"" Apr 16 18:10:42.031161 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.031143 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v2fsg\"" Apr 16 18:10:42.037433 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.037382 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" Apr 16 18:10:42.039039 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.039021 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:42.068945 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.068914 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2v4gp\"" Apr 16 18:10:42.077633 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.077345 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:42.100194 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.099264 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:42.100194 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.099349 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:42.103563 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.103512 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3399f6b2-54c3-4ead-9fed-519bebb162da-cert\") pod \"ingress-canary-7gpt8\" (UID: \"3399f6b2-54c3-4ead-9fed-519bebb162da\") " pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:42.104251 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.104191 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac8b3146-c9ef-45ff-a401-4847e957c45c-metrics-tls\") pod \"dns-default-4d9lv\" (UID: \"ac8b3146-c9ef-45ff-a401-4847e957c45c\") " pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:42.127472 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.127215 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g72qw\"" Apr 16 18:10:42.134337 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.134226 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" Apr 16 18:10:42.156924 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.156716 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xkp6q\"" Apr 16 18:10:42.164373 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.163988 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" Apr 16 18:10:42.184230 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.184189 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"] Apr 16 18:10:42.190603 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:42.190160 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8668273d_77af_4f62_9e62_4fcaab486fec.slice/crio-736dd0ddaac4698804cbc8c6dc5a7c9ab97da60da3c31ecc0d939a1ae9f8486d WatchSource:0}: Error finding container 736dd0ddaac4698804cbc8c6dc5a7c9ab97da60da3c31ecc0d939a1ae9f8486d: Status 404 returned error can't find the container with id 736dd0ddaac4698804cbc8c6dc5a7c9ab97da60da3c31ecc0d939a1ae9f8486d Apr 16 18:10:42.203537 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.203491 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l"] Apr 16 18:10:42.242332 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.238460 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c6f985f48-6wxzx"] Apr 16 18:10:42.242332 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.240591 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkg79\"" Apr 16 18:10:42.242332 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:42.241305 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7tj5h\"" Apr 16 18:10:42.254875 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.254054 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gpt8" Apr 16 18:10:42.254875 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.254542 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:42.328028 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.327417 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm"] Apr 16 18:10:42.334584 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:42.334555 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba55740_f7dd_4c3c_9da7_00fa99217735.slice/crio-5b03f5bbead7a0f6eaf76d41a782cc402727b6e64a91ceb72fe9f7a947e27c42 WatchSource:0}: Error finding container 5b03f5bbead7a0f6eaf76d41a782cc402727b6e64a91ceb72fe9f7a947e27c42: Status 404 returned error can't find the container with id 5b03f5bbead7a0f6eaf76d41a782cc402727b6e64a91ceb72fe9f7a947e27c42 Apr 16 18:10:42.352536 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.350975 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf"] Apr 16 18:10:42.365138 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:42.365101 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e947e1e_4646_485c_a1cf_45fa455fe359.slice/crio-b9ab726e4729b020d6c6623333a3b930e34f9592d7b28e68220f18eb820ea435 WatchSource:0}: Error finding container b9ab726e4729b020d6c6623333a3b930e34f9592d7b28e68220f18eb820ea435: Status 404 returned error 
can't find the container with id b9ab726e4729b020d6c6623333a3b930e34f9592d7b28e68220f18eb820ea435 Apr 16 18:10:42.430749 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.430722 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gpt8"] Apr 16 18:10:42.432748 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:42.432721 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3399f6b2_54c3_4ead_9fed_519bebb162da.slice/crio-d90862905ad4809779374a87b96004f3010d195c8c54dcdc61682d4d1489cefa WatchSource:0}: Error finding container d90862905ad4809779374a87b96004f3010d195c8c54dcdc61682d4d1489cefa: Status 404 returned error can't find the container with id d90862905ad4809779374a87b96004f3010d195c8c54dcdc61682d4d1489cefa Apr 16 18:10:42.448250 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.448227 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4d9lv"] Apr 16 18:10:42.450682 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:42.450650 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8b3146_c9ef_45ff_a401_4847e957c45c.slice/crio-e7ae2f97c5302bb3936201abb6cda15e643dc8706e5ce5e2704b11f5783d2a14 WatchSource:0}: Error finding container e7ae2f97c5302bb3936201abb6cda15e643dc8706e5ce5e2704b11f5783d2a14: Status 404 returned error can't find the container with id e7ae2f97c5302bb3936201abb6cda15e643dc8706e5ce5e2704b11f5783d2a14 Apr 16 18:10:42.601610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.601572 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dc669799-6pc56" event={"ID":"8668273d-77af-4f62-9e62-4fcaab486fec","Type":"ContainerStarted","Data":"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5"} Apr 16 18:10:42.601799 ip-10-0-137-102 kubenswrapper[2563]: I0416 
18:10:42.601615 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dc669799-6pc56" event={"ID":"8668273d-77af-4f62-9e62-4fcaab486fec","Type":"ContainerStarted","Data":"736dd0ddaac4698804cbc8c6dc5a7c9ab97da60da3c31ecc0d939a1ae9f8486d"} Apr 16 18:10:42.601799 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.601659 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:10:42.602662 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.602639 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" event={"ID":"8e947e1e-4646-485c-a1cf-45fa455fe359","Type":"ContainerStarted","Data":"b9ab726e4729b020d6c6623333a3b930e34f9592d7b28e68220f18eb820ea435"} Apr 16 18:10:42.603810 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.603779 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" event={"ID":"4ba55740-f7dd-4c3c-9da7-00fa99217735","Type":"ContainerStarted","Data":"5b03f5bbead7a0f6eaf76d41a782cc402727b6e64a91ceb72fe9f7a947e27c42"} Apr 16 18:10:42.604876 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.604855 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gpt8" event={"ID":"3399f6b2-54c3-4ead-9fed-519bebb162da","Type":"ContainerStarted","Data":"d90862905ad4809779374a87b96004f3010d195c8c54dcdc61682d4d1489cefa"} Apr 16 18:10:42.606026 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.606002 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d9lv" event={"ID":"ac8b3146-c9ef-45ff-a401-4847e957c45c","Type":"ContainerStarted","Data":"e7ae2f97c5302bb3936201abb6cda15e643dc8706e5ce5e2704b11f5783d2a14"} Apr 16 18:10:42.607479 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.607458 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c6f985f48-6wxzx" event={"ID":"138781da-d4c5-4c2a-b64e-b740488095cf","Type":"ContainerStarted","Data":"f424201aad5d7b30147a2898b121ccb8885b237d8757d0b1f0fbac4fb40a020f"} Apr 16 18:10:42.607566 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.607484 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c6f985f48-6wxzx" event={"ID":"138781da-d4c5-4c2a-b64e-b740488095cf","Type":"ContainerStarted","Data":"28940b07bf8b558642c11ac90683de57f7110271add4a7e123010d1b6e0aad41"} Apr 16 18:10:42.608611 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.608589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" event={"ID":"3c225701-67ef-433b-93da-7745170f4769","Type":"ContainerStarted","Data":"68c41f5da0a4748486e0ce5d729b5ad572607e44b176720a1f7ccd1eaed57dd4"} Apr 16 18:10:42.620834 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.620779 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57dc669799-6pc56" podStartSLOduration=64.620761116 podStartE2EDuration="1m4.620761116s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:42.619712118 +0000 UTC m=+65.088320903" watchObservedRunningTime="2026-04-16 18:10:42.620761116 +0000 UTC m=+65.089369893" Apr 16 18:10:42.637828 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:42.637780 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-c6f985f48-6wxzx" podStartSLOduration=58.637764034 podStartE2EDuration="58.637764034s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:42.637206103 +0000 UTC m=+65.105814880" watchObservedRunningTime="2026-04-16 18:10:42.637764034 +0000 UTC m=+65.106372810" Apr 16 18:10:43.078483 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.078444 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:43.081717 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.081364 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:43.210015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.209968 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:43.219815 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.219745 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f08fcec0-1201-4e71-9f48-1e8276df4e23-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pgh6\" (UID: \"f08fcec0-1201-4e71-9f48-1e8276df4e23\") " pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:43.267709 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.267494 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bxtnw\"" Apr 16 18:10:43.274933 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.274908 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8pgh6" Apr 16 18:10:43.448983 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.448920 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8pgh6"] Apr 16 18:10:43.613774 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.613171 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:43.614727 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.614532 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-c6f985f48-6wxzx" Apr 16 18:10:43.816510 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.815879 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:43.822923 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:43.822853 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b1617ae-f25b-4a90-adf4-ca28c7c22774-metrics-certs\") pod \"network-metrics-daemon-pq587\" (UID: \"5b1617ae-f25b-4a90-adf4-ca28c7c22774\") " pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:43.975662 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:43.975621 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf08fcec0_1201_4e71_9f48_1e8276df4e23.slice/crio-68c254440ec43e0a88e0c8871bc5136d27500c1489200018cef0929a1dab2841 WatchSource:0}: Error finding container 68c254440ec43e0a88e0c8871bc5136d27500c1489200018cef0929a1dab2841: Status 404 returned error can't find the 
container with id 68c254440ec43e0a88e0c8871bc5136d27500c1489200018cef0929a1dab2841 Apr 16 18:10:44.106015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:44.105972 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\"" Apr 16 18:10:44.113631 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:44.113600 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pq587" Apr 16 18:10:44.617110 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:44.617072 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pgh6" event={"ID":"f08fcec0-1201-4e71-9f48-1e8276df4e23","Type":"ContainerStarted","Data":"68c254440ec43e0a88e0c8871bc5136d27500c1489200018cef0929a1dab2841"} Apr 16 18:10:46.220537 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.220488 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pq587"] Apr 16 18:10:46.624618 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.624509 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pq587" event={"ID":"5b1617ae-f25b-4a90-adf4-ca28c7c22774","Type":"ContainerStarted","Data":"ed10b177af668587da6c1f95696a2d09aa8a590a689a53d43ef98f4ecb34e92e"} Apr 16 18:10:46.626379 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.626348 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d9lv" event={"ID":"ac8b3146-c9ef-45ff-a401-4847e957c45c","Type":"ContainerStarted","Data":"0ede357c24315f2c311bd5c3967d0fe2781aa9d562f7d3fa02f1a3f32fb69a2c"} Apr 16 18:10:46.626539 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.626386 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d9lv" 
event={"ID":"ac8b3146-c9ef-45ff-a401-4847e957c45c","Type":"ContainerStarted","Data":"b85d17396d9dbba1311e57db74e8b81fb8c99b4f9fc8f5a3bc49c851077641a0"} Apr 16 18:10:46.626539 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.626466 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4d9lv" Apr 16 18:10:46.628066 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.628028 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" event={"ID":"3c225701-67ef-433b-93da-7745170f4769","Type":"ContainerStarted","Data":"3a873cbb96b2cf035eeff0bf5e2029ba0630040668c8c49b2a4700c93b9c5b26"} Apr 16 18:10:46.628066 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.628059 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" event={"ID":"3c225701-67ef-433b-93da-7745170f4769","Type":"ContainerStarted","Data":"0e748f461919d9d629420672d40c1912ce3039bd8f380230e92453dfebdd74f4"} Apr 16 18:10:46.629522 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.629502 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pgh6" event={"ID":"f08fcec0-1201-4e71-9f48-1e8276df4e23","Type":"ContainerStarted","Data":"971037d814ef0a0ba59a7b3ae79ecfdde840a4af710789e910784230aa6a915d"} Apr 16 18:10:46.631619 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.631586 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" event={"ID":"8e947e1e-4646-485c-a1cf-45fa455fe359","Type":"ContainerStarted","Data":"5154461553fd35e68328d2a2b3dbaafec1faa1db5e6d69eb3478ad2c210274f1"} Apr 16 18:10:46.633041 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.633011 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" event={"ID":"4ba55740-f7dd-4c3c-9da7-00fa99217735","Type":"ContainerStarted","Data":"56cf9764a731bf6fc182635b41bb35d61f55a31e14c7017d5d08bfb3349e545d"} Apr 16 18:10:46.634564 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.634544 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gpt8" event={"ID":"3399f6b2-54c3-4ead-9fed-519bebb162da","Type":"ContainerStarted","Data":"702c0f282825c7540b6f8846aba43dd6f7548d277d17385d5ee451a2f93929ce"} Apr 16 18:10:46.643060 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.643010 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4d9lv" podStartSLOduration=34.031033381 podStartE2EDuration="37.642994761s" podCreationTimestamp="2026-04-16 18:10:09 +0000 UTC" firstStartedPulling="2026-04-16 18:10:42.452809979 +0000 UTC m=+64.921418738" lastFinishedPulling="2026-04-16 18:10:46.064771352 +0000 UTC m=+68.533380118" observedRunningTime="2026-04-16 18:10:46.642391811 +0000 UTC m=+69.111000589" watchObservedRunningTime="2026-04-16 18:10:46.642994761 +0000 UTC m=+69.111603539" Apr 16 18:10:46.658296 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.658232 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-tjvkf" podStartSLOduration=56.961548989 podStartE2EDuration="1m0.65821167s" podCreationTimestamp="2026-04-16 18:09:46 +0000 UTC" firstStartedPulling="2026-04-16 18:10:42.368107582 +0000 UTC m=+64.836716351" lastFinishedPulling="2026-04-16 18:10:46.064770277 +0000 UTC m=+68.533379032" observedRunningTime="2026-04-16 18:10:46.657557973 +0000 UTC m=+69.126166752" watchObservedRunningTime="2026-04-16 18:10:46.65821167 +0000 UTC m=+69.126820449" Apr 16 18:10:46.674305 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.674243 2563 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-8mn2l" podStartSLOduration=58.939318949 podStartE2EDuration="1m2.674222694s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:42.332594154 +0000 UTC m=+64.801202915" lastFinishedPulling="2026-04-16 18:10:46.067497889 +0000 UTC m=+68.536106660" observedRunningTime="2026-04-16 18:10:46.672328598 +0000 UTC m=+69.140937376" watchObservedRunningTime="2026-04-16 18:10:46.674222694 +0000 UTC m=+69.142831472" Apr 16 18:10:46.690138 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.689800 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-8k8jm" podStartSLOduration=58.958673062 podStartE2EDuration="1m2.689780733s" podCreationTimestamp="2026-04-16 18:09:44 +0000 UTC" firstStartedPulling="2026-04-16 18:10:42.33696456 +0000 UTC m=+64.805573344" lastFinishedPulling="2026-04-16 18:10:46.068072256 +0000 UTC m=+68.536681015" observedRunningTime="2026-04-16 18:10:46.688087593 +0000 UTC m=+69.156696371" watchObservedRunningTime="2026-04-16 18:10:46.689780733 +0000 UTC m=+69.158389512" Apr 16 18:10:46.703969 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:46.703863 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7gpt8" podStartSLOduration=34.071474365 podStartE2EDuration="37.703844745s" podCreationTimestamp="2026-04-16 18:10:09 +0000 UTC" firstStartedPulling="2026-04-16 18:10:42.434815912 +0000 UTC m=+64.903424666" lastFinishedPulling="2026-04-16 18:10:46.067186279 +0000 UTC m=+68.535795046" observedRunningTime="2026-04-16 18:10:46.703045012 +0000 UTC m=+69.171653791" watchObservedRunningTime="2026-04-16 18:10:46.703844745 +0000 UTC m=+69.172453524" Apr 16 18:10:47.150343 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.150265 2563 scope.go:117] "RemoveContainer" 
containerID="7f7ee726319bff0adabdd2cba1056daa6d027efda4290df404c5c0215a55c775" Apr 16 18:10:47.639846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.639810 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pq587" event={"ID":"5b1617ae-f25b-4a90-adf4-ca28c7c22774","Type":"ContainerStarted","Data":"8fefef79196648a65011d5452170e1bbb2d97e1401da5a929f5821155931c9c7"} Apr 16 18:10:47.639846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.639852 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pq587" event={"ID":"5b1617ae-f25b-4a90-adf4-ca28c7c22774","Type":"ContainerStarted","Data":"d80f06a663db0066f1a1363db7e0fdf95e6b709d9fd9796a01d90229ed4a4018"} Apr 16 18:10:47.641604 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.641579 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:10:47.641735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.641682 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" event={"ID":"e658950a-b3d3-49ff-a5ce-4445a68ef06f","Type":"ContainerStarted","Data":"28837274a98a24acc25d825dc63ee3aa8bc13560dc125e72152470cea6e61063"} Apr 16 18:10:47.642013 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.641984 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:47.643464 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:47.643355 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pgh6" event={"ID":"f08fcec0-1201-4e71-9f48-1e8276df4e23","Type":"ContainerStarted","Data":"64843401baf2446bbeff975ed5dba292557e8ef9e9e53648052e958c960f3421"} Apr 16 18:10:47.656077 ip-10-0-137-102 
kubenswrapper[2563]: I0416 18:10:47.655972 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pq587" podStartSLOduration=68.614530755 podStartE2EDuration="1m9.655955372s" podCreationTimestamp="2026-04-16 18:09:38 +0000 UTC" firstStartedPulling="2026-04-16 18:10:46.234460097 +0000 UTC m=+68.703068853" lastFinishedPulling="2026-04-16 18:10:47.275884703 +0000 UTC m=+69.744493470" observedRunningTime="2026-04-16 18:10:47.655460103 +0000 UTC m=+70.124068879" watchObservedRunningTime="2026-04-16 18:10:47.655955372 +0000 UTC m=+70.124564151" Apr 16 18:10:48.642251 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:48.642207 2563 patch_prober.go:28] interesting pod/console-operator-d87b8d5fc-z7qz9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.9:8443/readyz\": context deadline exceeded" start-of-body= Apr 16 18:10:48.642678 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:48.642312 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" podUID="e658950a-b3d3-49ff-a5ce-4445a68ef06f" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.9:8443/readyz\": context deadline exceeded" Apr 16 18:10:48.754710 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:48.754674 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-z7qz9" Apr 16 18:10:49.650425 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:49.650367 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pgh6" event={"ID":"f08fcec0-1201-4e71-9f48-1e8276df4e23","Type":"ContainerStarted","Data":"55c093142c8bc5cc440fcba4b5021a3aecb3254564f5268a8fad9355d37f8cc4"} Apr 16 18:10:49.675754 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:49.675694 2563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8pgh6" podStartSLOduration=20.204814265 podStartE2EDuration="22.675676029s" podCreationTimestamp="2026-04-16 18:10:27 +0000 UTC" firstStartedPulling="2026-04-16 18:10:46.1797284 +0000 UTC m=+68.648337169" lastFinishedPulling="2026-04-16 18:10:48.650590165 +0000 UTC m=+71.119198933" observedRunningTime="2026-04-16 18:10:49.672852746 +0000 UTC m=+72.141461533" watchObservedRunningTime="2026-04-16 18:10:49.675676029 +0000 UTC m=+72.144284857" Apr 16 18:10:52.499774 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.499742 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-hf7xl"] Apr 16 18:10:52.542801 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.542769 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"] Apr 16 18:10:52.542996 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.542883 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:10:52.545929 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.545904 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:10:52.546066 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.545907 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:10:52.546066 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.545974 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-gx78z\""
Apr 16 18:10:52.573574 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.573551 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"]
Apr 16 18:10:52.573715 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.573579 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-hf7xl"]
Apr 16 18:10:52.573715 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.573689 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:52.576472 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.576450 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-zt76h\""
Apr 16 18:10:52.576608 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.576499 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 18:10:52.697142 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.697113 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/72ac79cd-2f56-4005-b2db-3224d498cafe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-pffnh\" (UID: \"72ac79cd-2f56-4005-b2db-3224d498cafe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:52.697321 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.697168 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvnj\" (UniqueName: \"kubernetes.io/projected/44f4ed53-7754-478c-a9d2-daaa094109e3-kube-api-access-4gvnj\") pod \"downloads-586b57c7b4-hf7xl\" (UID: \"44f4ed53-7754-478c-a9d2-daaa094109e3\") " pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:10:52.797790 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.797705 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/72ac79cd-2f56-4005-b2db-3224d498cafe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-pffnh\" (UID: \"72ac79cd-2f56-4005-b2db-3224d498cafe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:52.797790 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.797768 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvnj\" (UniqueName: \"kubernetes.io/projected/44f4ed53-7754-478c-a9d2-daaa094109e3-kube-api-access-4gvnj\") pod \"downloads-586b57c7b4-hf7xl\" (UID: \"44f4ed53-7754-478c-a9d2-daaa094109e3\") " pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:10:52.800411 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.800371 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/72ac79cd-2f56-4005-b2db-3224d498cafe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-pffnh\" (UID: \"72ac79cd-2f56-4005-b2db-3224d498cafe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:52.807294 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.807267 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvnj\" (UniqueName: \"kubernetes.io/projected/44f4ed53-7754-478c-a9d2-daaa094109e3-kube-api-access-4gvnj\") pod \"downloads-586b57c7b4-hf7xl\" (UID: \"44f4ed53-7754-478c-a9d2-daaa094109e3\") " pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:10:52.852507 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.852480 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:10:52.883039 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.882999 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:52.986151 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:52.986085 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-hf7xl"]
Apr 16 18:10:52.989680 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:52.989650 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f4ed53_7754_478c_a9d2_daaa094109e3.slice/crio-f7d5ba5c5346596fe05c0cc4a8c8db7776ef2797992ab5598ccf9748810ef441 WatchSource:0}: Error finding container f7d5ba5c5346596fe05c0cc4a8c8db7776ef2797992ab5598ccf9748810ef441: Status 404 returned error can't find the container with id f7d5ba5c5346596fe05c0cc4a8c8db7776ef2797992ab5598ccf9748810ef441
Apr 16 18:10:53.028302 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:53.028273 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"]
Apr 16 18:10:53.031811 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:10:53.031781 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ac79cd_2f56_4005_b2db_3224d498cafe.slice/crio-4a1587d2929d48c6b7aaf2cfd81ab8e2b12f37001301774d586953e09672fc48 WatchSource:0}: Error finding container 4a1587d2929d48c6b7aaf2cfd81ab8e2b12f37001301774d586953e09672fc48: Status 404 returned error can't find the container with id 4a1587d2929d48c6b7aaf2cfd81ab8e2b12f37001301774d586953e09672fc48
Apr 16 18:10:53.664565 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:53.664520 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh" event={"ID":"72ac79cd-2f56-4005-b2db-3224d498cafe","Type":"ContainerStarted","Data":"4a1587d2929d48c6b7aaf2cfd81ab8e2b12f37001301774d586953e09672fc48"}
Apr 16 18:10:53.665775 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:53.665740 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-hf7xl" event={"ID":"44f4ed53-7754-478c-a9d2-daaa094109e3","Type":"ContainerStarted","Data":"f7d5ba5c5346596fe05c0cc4a8c8db7776ef2797992ab5598ccf9748810ef441"}
Apr 16 18:10:55.673489 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:55.673449 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh" event={"ID":"72ac79cd-2f56-4005-b2db-3224d498cafe","Type":"ContainerStarted","Data":"feb20cf7123569006e191b4e51d8933cbd518e913b50408ab64997b9741385be"}
Apr 16 18:10:55.673961 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:55.673675 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:55.680488 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:55.680466 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh"
Apr 16 18:10:55.691431 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:55.691366 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-pffnh" podStartSLOduration=2.073927296 podStartE2EDuration="3.691354424s" podCreationTimestamp="2026-04-16 18:10:52 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.033802494 +0000 UTC m=+75.502411248" lastFinishedPulling="2026-04-16 18:10:54.651229605 +0000 UTC m=+77.119838376" observedRunningTime="2026-04-16 18:10:55.690177606 +0000 UTC m=+78.158786385" watchObservedRunningTime="2026-04-16 18:10:55.691354424 +0000 UTC m=+78.159963200"
Apr 16 18:10:56.547085 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:56.546967 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4xm4j"
Apr 16 18:10:56.646516 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:10:56.646484 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4d9lv"
Apr 16 18:11:01.143810 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.143770 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wp4mg"]
Apr 16 18:11:01.178520 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.178496 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.181610 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.181584 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:11:01.182077 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.181728 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:11:01.182077 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.181818 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rnk47\""
Apr 16 18:11:01.182077 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.181963 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:11:01.182077 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.182002 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265727 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-textfile\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265780 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-wtmp\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265810 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265844 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-accelerators-collector-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265896 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk8pz\" (UniqueName: \"kubernetes.io/projected/55d17b00-445e-4125-883e-c5af9e702a30-kube-api-access-sk8pz\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.265990 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.265923 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-metrics-client-ca\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.266462 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.266027 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-sys\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.266462 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.266063 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-tls\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.266462 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.266092 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-root\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367201 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-sys\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367250 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-tls\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367282 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-root\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367305 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-sys\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367317 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-textfile\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-wtmp\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367428 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367418 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367744 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-accelerators-collector-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367744 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367507 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk8pz\" (UniqueName: \"kubernetes.io/projected/55d17b00-445e-4125-883e-c5af9e702a30-kube-api-access-sk8pz\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.367744 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.367537 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-metrics-client-ca\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.368346 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.368126 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-metrics-client-ca\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.368346 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.368292 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-wtmp\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.368346 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.368335 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-textfile\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.368795 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.368352 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/55d17b00-445e-4125-883e-c5af9e702a30-root\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.368795 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.368497 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-accelerators-collector-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.370723 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.370698 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-tls\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.371251 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.371207 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55d17b00-445e-4125-883e-c5af9e702a30-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.377475 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.377449 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk8pz\" (UniqueName: \"kubernetes.io/projected/55d17b00-445e-4125-883e-c5af9e702a30-kube-api-access-sk8pz\") pod \"node-exporter-wp4mg\" (UID: \"55d17b00-445e-4125-883e-c5af9e702a30\") " pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.491385 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.491346 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wp4mg"
Apr 16 18:11:01.700481 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:01.700440 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wp4mg" event={"ID":"55d17b00-445e-4125-883e-c5af9e702a30","Type":"ContainerStarted","Data":"d34bcc586fe8c59fa7144b255ddd71d7753d391465186b4942755f1ccaa380c3"}
Apr 16 18:11:02.045466 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:02.045114 2563 patch_prober.go:28] interesting pod/image-registry-57dc669799-6pc56 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:11:02.045466 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:02.045180 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-57dc669799-6pc56" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:11:03.618678 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:03.618646 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:11:03.709253 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:03.709212 2563 generic.go:358] "Generic (PLEG): container finished" podID="55d17b00-445e-4125-883e-c5af9e702a30" containerID="4e8d9c954f9c1aca4d3d8bbe56523852353100205ef5022a9a7c5113ad446564" exitCode=0
Apr 16 18:11:03.709438 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:03.709260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wp4mg" event={"ID":"55d17b00-445e-4125-883e-c5af9e702a30","Type":"ContainerDied","Data":"4e8d9c954f9c1aca4d3d8bbe56523852353100205ef5022a9a7c5113ad446564"}
Apr 16 18:11:10.740025 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.739980 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wp4mg" event={"ID":"55d17b00-445e-4125-883e-c5af9e702a30","Type":"ContainerStarted","Data":"512400a61fc73d932634cc87bf3cc9b29c7ab4fd0f258090e01bb4d1713191ea"}
Apr 16 18:11:10.740025 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.740027 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wp4mg" event={"ID":"55d17b00-445e-4125-883e-c5af9e702a30","Type":"ContainerStarted","Data":"2eb8ecf7975f25e585394ca7a9063c951293c7c9171a4e58cd5aa5fbf766d8fa"}
Apr 16 18:11:10.741600 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.741569 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-hf7xl" event={"ID":"44f4ed53-7754-478c-a9d2-daaa094109e3","Type":"ContainerStarted","Data":"3b3c240e91712a96523eb13227794b3bb9e8911088263e35bc21ffdec3af0542"}
Apr 16 18:11:10.741832 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.741813 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:11:10.760701 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.760644 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wp4mg" podStartSLOduration=8.397120696 podStartE2EDuration="9.760625966s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:01.50593882 +0000 UTC m=+83.974547588" lastFinishedPulling="2026-04-16 18:11:02.869444099 +0000 UTC m=+85.338052858" observedRunningTime="2026-04-16 18:11:10.758859329 +0000 UTC m=+93.227468130" watchObservedRunningTime="2026-04-16 18:11:10.760625966 +0000 UTC m=+93.229234743"
Apr 16 18:11:10.760957 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.760930 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-hf7xl"
Apr 16 18:11:10.775851 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:10.775788 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-hf7xl" podStartSLOduration=1.482603968 podStartE2EDuration="18.775768985s" podCreationTimestamp="2026-04-16 18:10:52 +0000 UTC" firstStartedPulling="2026-04-16 18:10:52.992225074 +0000 UTC m=+75.460833843" lastFinishedPulling="2026-04-16 18:11:10.285390105 +0000 UTC m=+92.753998860" observedRunningTime="2026-04-16 18:11:10.774794151 +0000 UTC m=+93.243402930" watchObservedRunningTime="2026-04-16 18:11:10.775768985 +0000 UTC m=+93.244377763"
Apr 16 18:11:15.134994 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:15.134954 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"]
Apr 16 18:11:32.815341 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:32.815307 2563 generic.go:358] "Generic (PLEG): container finished" podID="7d0b4ef5-4d33-472c-bc36-98eecd77e026" containerID="a9fc3b3d408ccbe8648e7917cc22a4379bbf4eec09d0a1bbbc0ad4bd229b2328" exitCode=0
Apr 16 18:11:32.815756 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:32.815381 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" event={"ID":"7d0b4ef5-4d33-472c-bc36-98eecd77e026","Type":"ContainerDied","Data":"a9fc3b3d408ccbe8648e7917cc22a4379bbf4eec09d0a1bbbc0ad4bd229b2328"}
Apr 16 18:11:32.815756 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:32.815739 2563 scope.go:117] "RemoveContainer" containerID="a9fc3b3d408ccbe8648e7917cc22a4379bbf4eec09d0a1bbbc0ad4bd229b2328"
Apr 16 18:11:33.819274 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:33.819240 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qvbqd" event={"ID":"7d0b4ef5-4d33-472c-bc36-98eecd77e026","Type":"ContainerStarted","Data":"c44e29faf7f28224520009b6af166dbee95a645ffc1d6088dd0679bf7162d2df"}
Apr 16 18:11:35.826339 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.826259 2563 generic.go:358] "Generic (PLEG): container finished" podID="94f90437-3d9f-443f-9a60-c3237a421595" containerID="7aee0a3ff859d01a834e56512cf2a69c7742604b8f69202af3953861db66f6dd" exitCode=0
Apr 16 18:11:35.826339 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.826331 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" event={"ID":"94f90437-3d9f-443f-9a60-c3237a421595","Type":"ContainerDied","Data":"7aee0a3ff859d01a834e56512cf2a69c7742604b8f69202af3953861db66f6dd"}
Apr 16 18:11:35.826840 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.826688 2563 scope.go:117] "RemoveContainer" containerID="7aee0a3ff859d01a834e56512cf2a69c7742604b8f69202af3953861db66f6dd"
Apr 16 18:11:35.827789 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.827763 2563 generic.go:358] "Generic (PLEG): container finished" podID="5869ca3b-420e-46b0-ac02-e5572d8d6b05" containerID="fd3a42b3e94dde3fceeffeba5a2d9cff2dc2163df615fd34e604d3bc73bc3529" exitCode=0
Apr 16 18:11:35.827907 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.827798 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" event={"ID":"5869ca3b-420e-46b0-ac02-e5572d8d6b05","Type":"ContainerDied","Data":"fd3a42b3e94dde3fceeffeba5a2d9cff2dc2163df615fd34e604d3bc73bc3529"}
Apr 16 18:11:35.828163 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:35.828088 2563 scope.go:117] "RemoveContainer" containerID="fd3a42b3e94dde3fceeffeba5a2d9cff2dc2163df615fd34e604d3bc73bc3529"
Apr 16 18:11:36.832727 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:36.832688 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-rxktg" event={"ID":"5869ca3b-420e-46b0-ac02-e5572d8d6b05","Type":"ContainerStarted","Data":"f699015933970be89bce04f1c3c5794e0f1c833f586d05f0373602f1e5b7e8b3"}
Apr 16 18:11:36.834377 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:36.834349 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-whkpr" event={"ID":"94f90437-3d9f-443f-9a60-c3237a421595","Type":"ContainerStarted","Data":"173b629767ea7ffcaac8cbc089280cafe4284cc2eff0a09ed8b507329e8b3a9c"}
Apr 16 18:11:40.158103 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.158047 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57dc669799-6pc56" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" containerName="registry" containerID="cri-o://c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5" gracePeriod=30
Apr 16 18:11:40.409498 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.409432 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" podUID="8743f843-ea56-44b5-baa1-1718ae9ee68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 18:11:40.414973 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.414954 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc669799-6pc56"
Apr 16 18:11:40.521417 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521359 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521579 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521438 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521579 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521491 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521579 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521531 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521579 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521569 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8h7\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521797 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521597 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521797 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521628 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.521797 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.521653 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates\") pod \"8668273d-77af-4f62-9e62-4fcaab486fec\" (UID: \"8668273d-77af-4f62-9e62-4fcaab486fec\") "
Apr 16 18:11:40.522242 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.522137 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:11:40.522242 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.522187 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:11:40.524383 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.524303 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:40.524383 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.524308 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:11:40.524383 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.524346 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:40.524383 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.524368 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:11:40.524709 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.524517 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7" (OuterVolumeSpecName: "kube-api-access-sq8h7") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "kube-api-access-sq8h7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:40.529905 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.529881 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8668273d-77af-4f62-9e62-4fcaab486fec" (UID: "8668273d-77af-4f62-9e62-4fcaab486fec"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:40.622846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622808 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8668273d-77af-4f62-9e62-4fcaab486fec-ca-trust-extracted\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 16 18:11:40.622846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622841 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-bound-sa-token\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 16 18:11:40.622846 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622851 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-installation-pull-secrets\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 16 18:11:40.623068
ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622862 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-registry-tls\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 16 18:11:40.623068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622871 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sq8h7\" (UniqueName: \"kubernetes.io/projected/8668273d-77af-4f62-9e62-4fcaab486fec-kube-api-access-sq8h7\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 16 18:11:40.623068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622880 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-trusted-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 16 18:11:40.623068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622889 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8668273d-77af-4f62-9e62-4fcaab486fec-image-registry-private-configuration\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 16 18:11:40.623068 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.622898 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8668273d-77af-4f62-9e62-4fcaab486fec-registry-certificates\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 16 18:11:40.851855 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.851815 2563 generic.go:358] "Generic (PLEG): container finished" podID="8668273d-77af-4f62-9e62-4fcaab486fec" containerID="c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5" exitCode=0 Apr 16 18:11:40.852015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.851893 2563 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc669799-6pc56" Apr 16 18:11:40.852015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.851891 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dc669799-6pc56" event={"ID":"8668273d-77af-4f62-9e62-4fcaab486fec","Type":"ContainerDied","Data":"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5"} Apr 16 18:11:40.852015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.851939 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dc669799-6pc56" event={"ID":"8668273d-77af-4f62-9e62-4fcaab486fec","Type":"ContainerDied","Data":"736dd0ddaac4698804cbc8c6dc5a7c9ab97da60da3c31ecc0d939a1ae9f8486d"} Apr 16 18:11:40.852015 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.851960 2563 scope.go:117] "RemoveContainer" containerID="c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5" Apr 16 18:11:40.862868 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.862851 2563 scope.go:117] "RemoveContainer" containerID="c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5" Apr 16 18:11:40.863123 ip-10-0-137-102 kubenswrapper[2563]: E0416 18:11:40.863101 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5\": container with ID starting with c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5 not found: ID does not exist" containerID="c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5" Apr 16 18:11:40.863172 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.863132 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5"} err="failed to get container status 
\"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5\": rpc error: code = NotFound desc = could not find container \"c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5\": container with ID starting with c75826fa40d039c64c583f86aa51a49760165f28d1f1544d95847286614de4b5 not found: ID does not exist" Apr 16 18:11:40.875021 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.874996 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"] Apr 16 18:11:40.878710 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:40.878682 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57dc669799-6pc56"] Apr 16 18:11:42.155024 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:42.154985 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" path="/var/lib/kubelet/pods/8668273d-77af-4f62-9e62-4fcaab486fec/volumes" Apr 16 18:11:50.409302 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:11:50.409259 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" podUID="8743f843-ea56-44b5-baa1-1718ae9ee68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:00.409759 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.409716 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" podUID="8743f843-ea56-44b5-baa1-1718ae9ee68a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:00.410220 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.409789 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" Apr 16 
18:12:00.410424 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.410386 2563 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"784a813537c61336b8d20862609724a48ca07a1aee9e28c857f793ee02cb459c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:12:00.410469 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.410451 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" podUID="8743f843-ea56-44b5-baa1-1718ae9ee68a" containerName="service-proxy" containerID="cri-o://784a813537c61336b8d20862609724a48ca07a1aee9e28c857f793ee02cb459c" gracePeriod=30 Apr 16 18:12:00.921282 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.921246 2563 generic.go:358] "Generic (PLEG): container finished" podID="8743f843-ea56-44b5-baa1-1718ae9ee68a" containerID="784a813537c61336b8d20862609724a48ca07a1aee9e28c857f793ee02cb459c" exitCode=2 Apr 16 18:12:00.921472 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.921319 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerDied","Data":"784a813537c61336b8d20862609724a48ca07a1aee9e28c857f793ee02cb459c"} Apr 16 18:12:00.921472 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:12:00.921359 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f5d8556-cbvv8" event={"ID":"8743f843-ea56-44b5-baa1-1718ae9ee68a","Type":"ContainerStarted","Data":"51ee64dcb0594e2a9045546924c94135c41419323c8ce0591752687fba7eb01d"} Apr 16 18:14:38.017912 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:14:38.017874 2563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:14:38.019859 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:14:38.019838 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:14:38.028710 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:14:38.028682 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:19:38.041587 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:19:38.041561 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:19:38.042038 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:19:38.041918 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:24:38.064418 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:24:38.064324 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:24:38.065621 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:24:38.065597 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:29:38.084349 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:29:38.084317 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:29:38.085734 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:29:38.085708 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:34:38.104296 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:34:38.104263 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:34:38.106746 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:34:38.106701 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:39:38.126933 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:39:38.126814 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:39:38.131932 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:39:38.129803 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:44:38.149341 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:44:38.149224 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:44:38.153341 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:44:38.152274 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:49:38.170871 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:49:38.170767 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:49:38.174791 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:49:38.174686 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:54:38.191215 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:54:38.191176 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:54:38.195362 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:54:38.195340 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:58:30.827234 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:30.827198 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5987t_50d6b6c6-7edb-4c20-9fa7-9d7f97465217/global-pull-secret-syncer/0.log" Apr 16 18:58:30.929676 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:30.929642 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-k4n6b_ebb55998-f72d-48cf-bc9a-55fbe1047ae8/konnectivity-agent/0.log" Apr 16 18:58:31.065998 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:31.065974 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-102.ec2.internal_51e2afdeb323f70af661db7224d1a09f/haproxy/0.log" Apr 16 18:58:34.765609 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:34.765578 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-8k8jm_4ba55740-f7dd-4c3c-9da7-00fa99217735/cluster-monitoring-operator/0.log" Apr 16 
18:58:35.030284 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:35.030248 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wp4mg_55d17b00-445e-4125-883e-c5af9e702a30/node-exporter/0.log" Apr 16 18:58:35.057529 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:35.057494 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wp4mg_55d17b00-445e-4125-883e-c5af9e702a30/kube-rbac-proxy/0.log" Apr 16 18:58:35.086616 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:35.086592 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wp4mg_55d17b00-445e-4125-883e-c5af9e702a30/init-textfile/0.log" Apr 16 18:58:35.545173 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:35.545128 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-pffnh_72ac79cd-2f56-4005-b2db-3224d498cafe/prometheus-operator-admission-webhook/0.log" Apr 16 18:58:37.059773 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:37.059735 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-tjvkf_8e947e1e-4646-485c-a1cf-45fa455fe359/networking-console-plugin/0.log" Apr 16 18:58:37.513620 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:37.513590 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/1.log" Apr 16 18:58:37.517783 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:37.517756 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-z7qz9_e658950a-b3d3-49ff-a5ce-4445a68ef06f/console-operator/2.log" Apr 16 18:58:37.930682 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:37.930651 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-586b57c7b4-hf7xl_44f4ed53-7754-478c-a9d2-daaa094109e3/download-server/0.log" Apr 16 18:58:38.206821 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.206733 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"] Apr 16 18:58:38.207255 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.207167 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" containerName="registry" Apr 16 18:58:38.207255 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.207187 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" containerName="registry" Apr 16 18:58:38.207367 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.207263 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="8668273d-77af-4f62-9e62-4fcaab486fec" containerName="registry" Apr 16 18:58:38.210336 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.210315 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.213436 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.213417 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"kube-root-ca.crt\"" Apr 16 18:58:38.213551 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.213463 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"openshift-service-ca.crt\"" Apr 16 18:58:38.214677 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.214660 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lv8lf\"/\"default-dockercfg-r8bqr\"" Apr 16 18:58:38.219147 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.219128 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"] Apr 16 18:58:38.267772 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.267737 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-podres\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.267772 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.267774 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-sys\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.267985 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.267854 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-proc\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.267985 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.267899 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68sz\" (UniqueName: \"kubernetes.io/projected/db854d32-706f-4723-9db8-88f9f6b81c39-kube-api-access-b68sz\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.267985 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.267917 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-lib-modules\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369206 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369167 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-proc\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369226 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b68sz\" (UniqueName: \"kubernetes.io/projected/db854d32-706f-4723-9db8-88f9f6b81c39-kube-api-access-b68sz\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " 
pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369246 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-lib-modules\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369278 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-podres\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369296 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-sys\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369304 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-proc\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369389 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369389 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-sys\") pod 
\"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369626 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369457 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-podres\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.369626 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.369457 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db854d32-706f-4723-9db8-88f9f6b81c39-lib-modules\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.378363 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.378334 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68sz\" (UniqueName: \"kubernetes.io/projected/db854d32-706f-4723-9db8-88f9f6b81c39-kube-api-access-b68sz\") pod \"perf-node-gather-daemonset-7tqw6\" (UID: \"db854d32-706f-4723-9db8-88f9f6b81c39\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" Apr 16 18:58:38.382533 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.382482 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-z46bv_ea87fd15-995a-4cb6-9f35-8ef427ef8e52/volume-data-source-validator/0.log" Apr 16 18:58:38.521011 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.520921 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"
Apr 16 18:58:38.641835 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.641666 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"]
Apr 16 18:58:38.644530 ip-10-0-137-102 kubenswrapper[2563]: W0416 18:58:38.644501 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddb854d32_706f_4723_9db8_88f9f6b81c39.slice/crio-4b211b9676b906349002cd0e11699cae82c8d348e46fcf670812956c3445fd69 WatchSource:0}: Error finding container 4b211b9676b906349002cd0e11699cae82c8d348e46fcf670812956c3445fd69: Status 404 returned error can't find the container with id 4b211b9676b906349002cd0e11699cae82c8d348e46fcf670812956c3445fd69
Apr 16 18:58:38.646025 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:38.646006 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:58:39.005735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.005695 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" event={"ID":"db854d32-706f-4723-9db8-88f9f6b81c39","Type":"ContainerStarted","Data":"3a2a9c4f5ebc5804088a8684d1a6b65c4df0324b168ec81d1b6a1518b98f60e5"}
Apr 16 18:58:39.005735 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.005738 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" event={"ID":"db854d32-706f-4723-9db8-88f9f6b81c39","Type":"ContainerStarted","Data":"4b211b9676b906349002cd0e11699cae82c8d348e46fcf670812956c3445fd69"}
Apr 16 18:58:39.005929 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.005860 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"
Apr 16 18:58:39.024047 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.023996 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6" podStartSLOduration=1.023983478 podStartE2EDuration="1.023983478s" podCreationTimestamp="2026-04-16 18:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:58:39.022385542 +0000 UTC m=+2941.490994313" watchObservedRunningTime="2026-04-16 18:58:39.023983478 +0000 UTC m=+2941.492592274"
Apr 16 18:58:39.102862 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.102839 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4d9lv_ac8b3146-c9ef-45ff-a401-4847e957c45c/dns/0.log"
Apr 16 18:58:39.127251 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.127224 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4d9lv_ac8b3146-c9ef-45ff-a401-4847e957c45c/kube-rbac-proxy/0.log"
Apr 16 18:58:39.290705 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.290626 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5lcj4_1297cac1-5827-4690-b5d6-38c2ba71da4e/dns-node-resolver/0.log"
Apr 16 18:58:39.865605 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:39.865571 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4fj5x_757de476-6da8-4345-b7fc-6c36ed994dea/node-ca/0.log"
Apr 16 18:58:40.693169 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:40.693142 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-c6f985f48-6wxzx_138781da-d4c5-4c2a-b64e-b740488095cf/router/0.log"
Apr 16 18:58:41.035444 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.035356 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7gpt8_3399f6b2-54c3-4ead-9fed-519bebb162da/serve-healthcheck-canary/0.log"
Apr 16 18:58:41.449630 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.449595 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-rxktg_5869ca3b-420e-46b0-ac02-e5572d8d6b05/insights-operator/0.log"
Apr 16 18:58:41.450473 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.450453 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-rxktg_5869ca3b-420e-46b0-ac02-e5572d8d6b05/insights-operator/1.log"
Apr 16 18:58:41.479856 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.479832 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pgh6_f08fcec0-1201-4e71-9f48-1e8276df4e23/kube-rbac-proxy/0.log"
Apr 16 18:58:41.506798 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.506769 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pgh6_f08fcec0-1201-4e71-9f48-1e8276df4e23/exporter/0.log"
Apr 16 18:58:41.532944 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:41.532895 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pgh6_f08fcec0-1201-4e71-9f48-1e8276df4e23/extractor/0.log"
Apr 16 18:58:45.019147 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:45.019120 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-7tqw6"
Apr 16 18:58:48.611818 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:48.611787 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5hrgg_00d41a7d-3c9c-48f9-adde-5ceda2b430a0/migrator/0.log"
Apr 16 18:58:48.637300 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:48.637268 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5hrgg_00d41a7d-3c9c-48f9-adde-5ceda2b430a0/graceful-termination/0.log"
Apr 16 18:58:48.979696 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:48.979662 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qvbqd_7d0b4ef5-4d33-472c-bc36-98eecd77e026/kube-storage-version-migrator-operator/1.log"
Apr 16 18:58:48.980571 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:48.980551 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qvbqd_7d0b4ef5-4d33-472c-bc36-98eecd77e026/kube-storage-version-migrator-operator/0.log"
Apr 16 18:58:49.992123 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:49.992051 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9f96s_495e5ec9-68fc-4e69-a6b1-a92f31029302/kube-multus/0.log"
Apr 16 18:58:50.502554 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.502522 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:58:50.546304 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.546275 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/egress-router-binary-copy/0.log"
Apr 16 18:58:50.588314 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.588288 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/cni-plugins/0.log"
Apr 16 18:58:50.630611 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.630576 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/bond-cni-plugin/0.log"
Apr 16 18:58:50.670804 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.670771 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/routeoverride-cni/0.log"
Apr 16 18:58:50.694758 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.694729 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/whereabouts-cni-bincopy/0.log"
Apr 16 18:58:50.722177 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.722150 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9jdb_afb3aa46-f688-46a6-9d9f-7529d606c9dc/whereabouts-cni/0.log"
Apr 16 18:58:50.943109 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.943080 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pq587_5b1617ae-f25b-4a90-adf4-ca28c7c22774/network-metrics-daemon/0.log"
Apr 16 18:58:50.981591 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:50.981562 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pq587_5b1617ae-f25b-4a90-adf4-ca28c7c22774/kube-rbac-proxy/0.log"
Apr 16 18:58:51.764142 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.764112 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/ovn-controller/0.log"
Apr 16 18:58:51.802484 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.802449 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/ovn-acl-logging/0.log"
Apr 16 18:58:51.824221 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.824196 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/kube-rbac-proxy-node/0.log"
Apr 16 18:58:51.848770 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.848739 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:58:51.876206 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.876178 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/northd/0.log"
Apr 16 18:58:51.903875 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.903847 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/nbdb/0.log"
Apr 16 18:58:51.929460 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:51.929429 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/sbdb/0.log"
Apr 16 18:58:52.037154 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:52.037039 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpjc_2d1cf278-4df4-49a3-930a-9184e51a38b8/ovnkube-controller/0.log"
Apr 16 18:58:53.810296 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:53.810254 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-h6q2h_367599ae-c563-446b-95e4-1f750b698283/check-endpoints/0.log"
Apr 16 18:58:53.838902 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:53.838871 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4xm4j_76ba3cac-7c44-4ba0-aefc-cfded09ee26e/network-check-target-container/0.log"
Apr 16 18:58:54.888821 ip-10-0-137-102 kubenswrapper[2563]: I0416 18:58:54.888791 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zxsjz_ffb79d59-c86a-4570-91f6-5257796c1cb9/iptables-alerter/0.log"