Apr 22 15:33:48.541540 ip-10-0-143-128 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:33:48.997901 ip-10-0-143-128 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:48.997901 ip-10-0-143-128 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:33:48.997901 ip-10-0-143-128 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:48.997901 ip-10-0-143-128 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:33:48.997901 ip-10-0-143-128 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:49.000428 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.000318 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:33:49.003770 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003745 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:49.003770 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003768 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:49.003770 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003772 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003776 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003780 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003783 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003786 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003788 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003792 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003796 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003798 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003801 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003804 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003807 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003817 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003820 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003823 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003825 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003828 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003831 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003833 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003836 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:49.003875 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003839 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003841 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003844 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003846 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003849 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003851 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003854 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003857 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003860 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003862 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003864 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003867 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003870 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003872 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003875 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003878 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003881 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003884 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003886 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003889 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:49.004382 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003893 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003896 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003898 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003900 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003904 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003909 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003913 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003916 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003918 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003921 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003924 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003927 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003929 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003932 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003935 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003937 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003940 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003942 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003947 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:49.004912 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003951 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003954 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003957 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003960 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003963 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003967 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003969 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003973 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003976 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003978 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003981 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003983 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003986 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003989 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003992 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003995 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.003997 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004000 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004002 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004005 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:49.005397 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004007 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004010 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004013 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004015 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004018 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004466 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004472 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004475 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004477 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004480 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004482 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004485 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004488 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004491 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004493 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004496 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004498 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004501 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004504 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:49.005964 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004507 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004509 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004512 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004514 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004517 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004519 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004522 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004525 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004528 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004531 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004533 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004536 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004539 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004541 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004544 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004547 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004549 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004551 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004554 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004556 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:49.006481 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004559 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004561 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004564 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004566 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004568 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004571 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004574 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004576 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004579 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004582 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004584 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004587 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004590 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004592 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004595 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004597 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004599 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004602 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004606 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004608 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:49.007028 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004611 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004613 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004616 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004618 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004621 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004623 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004625 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004628 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004630 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004633 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004635 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004638 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004640 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004642 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004645 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004647 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004649 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004652 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004654 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004656 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:49.007537 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004660 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004664 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004667 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004670 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004673 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004675 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004678 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004681 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004683 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004686 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004690 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.004693 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005652 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005671 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005678 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005684 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005690 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005693 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005698 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005702 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:33:49.008021 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005706 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005709 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005713 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005716 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005719 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005722 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005725 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005728 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005731 2573 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005734 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005737 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005743 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005747 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005750 2573 flags.go:64] FLAG: --config-dir=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005753 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005756 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005760 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005764 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005767 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005770 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005773 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005776 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005779 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005782 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005785 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:33:49.008527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005790 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005793 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005796 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005799 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005802 2573 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005805 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005809 2573 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005813 2573 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005816 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005819 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005822 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005826 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005829 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005832 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005836 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005839 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005842 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005845 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005847 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005851 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005854 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005857 2573 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005861 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:33:49.009160 ip-10-0-143-128
kubenswrapper[2573]: I0422 15:33:49.005864 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005867 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:33:49.009160 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005871 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005875 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005878 2573 flags.go:64] FLAG: --help="false" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005881 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005884 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005887 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005890 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005894 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005897 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005900 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005903 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005906 2573 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005909 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005911 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005915 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005918 2573 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005921 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005924 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005927 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005929 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005932 2573 flags.go:64] FLAG: --lock-file="" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005935 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005938 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005941 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:33:49.009758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005947 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005950 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:33:49.010349 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:49.005953 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005957 2573 flags.go:64] FLAG: --logging-format="text" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005960 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005963 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005970 2573 flags.go:64] FLAG: --manifest-url="" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005974 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005978 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005982 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005986 2573 flags.go:64] FLAG: --max-pods="110" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005989 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005992 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005995 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.005998 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006001 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006004 2573 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006007 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006015 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006018 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006021 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006025 2573 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006028 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:33:49.010349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006035 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006038 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006041 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006044 2573 flags.go:64] FLAG: --port="10250" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006047 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006050 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ece21eb192c40475" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006053 2573 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006056 
2573 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006071 2573 flags.go:64] FLAG: --register-node="true" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006075 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006078 2573 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006081 2573 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006085 2573 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006088 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006091 2573 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006096 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006099 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006102 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006105 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006109 2573 flags.go:64] FLAG: --runonce="false" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006112 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006115 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 
15:33:49.006118 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006121 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006124 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006127 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:33:49.010920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006130 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006134 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006137 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006140 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006142 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006145 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006148 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006151 2573 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006154 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006159 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006162 2573 flags.go:64] FLAG: 
--tls-cert-file="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006165 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006169 2573 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006172 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006178 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006181 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006184 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006188 2573 flags.go:64] FLAG: --v="2" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006192 2573 flags.go:64] FLAG: --version="false" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006196 2573 flags.go:64] FLAG: --vmodule="" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006201 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.006208 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006319 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006323 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:33:49.011566 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006326 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:33:49.012196 
ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006329 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006333 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006335 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006338 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006341 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006343 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006345 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006349 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006351 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006354 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006356 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006359 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006361 2573 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006364 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006367 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006369 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006372 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006375 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:33:49.012196 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006377 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006380 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006382 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006396 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006400 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006402 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006405 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration 
Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006408 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006410 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006413 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006417 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006419 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006423 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006425 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006429 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006433 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006438 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006441 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006444 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006447 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:33:49.012718 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006449 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006451 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006454 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006456 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006459 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006462 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006464 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006466 2573 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006469 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006472 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006474 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006476 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006479 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006481 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006484 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006488 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006490 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006493 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006496 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:33:49.013234 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006500 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006503 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006505 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006509 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006511 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006513 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006516 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006518 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006521 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006523 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006527 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006529 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006532 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006534 2573 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006536 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006539 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006541 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006544 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006546 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006549 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:33:49.013700 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006552 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:33:49.014200 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006554 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:33:49.014200 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006557 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:33:49.014200 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006559 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:33:49.014200 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006562 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:33:49.014200 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.006565 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:33:49.014200 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:49.007279 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:33:49.016832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.016672 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:33:49.016881 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.016835 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016897 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016903 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016906 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016909 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016912 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:49.016914 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016917 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016920 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016923 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016926 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016929 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016933 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016935 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016938 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016941 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016943 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016946 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016949 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016951 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016954 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016956 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016959 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016961 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016964 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016966 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016969 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:49.017175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016972 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016975 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016981 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016984 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016987 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016990 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016993 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016996 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.016998 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017001 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017003 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017006 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017009 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017012 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017015 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017017 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017020 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017023 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:49.017670 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017025 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017028 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017031 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017033 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017036 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017039 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017041 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017044 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017047 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017049 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017052 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017054 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017057 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017077 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017081 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017085 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017089 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017092 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017096 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017102 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:49.018158 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017105 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017108 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017111 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017113 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017116 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017118 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017122 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017125 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017128 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017130 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017133 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017136 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017138 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017140 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017143 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017146 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017149 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017151 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017154 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017157 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:49.018675 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017159 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017162 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017164 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.017170 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017308 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017313 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017317 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017320 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017323 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017325 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017328 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017331 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017334 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017336 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017340 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:49.019198 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017343 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017346 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017349 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017352 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017354 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017358 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017362 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017364 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017367 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017369 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017372 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017374 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017377 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017379 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017381 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017384 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017387 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017389 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017392 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:49.019578 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017394 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017398 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017401 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017404 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017407 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017409 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017412 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017414 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017417 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017419 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017422 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017424 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017428 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017431 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017434 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017436 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017439 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017441 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017444 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:49.020042 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017446 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017448 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017451 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017453 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017455 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017458 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017460 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017463 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017465 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017467 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017470 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017473 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017475 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017477 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017480 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017482 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017485 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017487 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017489 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017491 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:49.020517 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017494 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017497 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017500 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017502 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017505 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017507 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017509 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017520 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017523 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017525 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017528 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017530 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017533 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017535 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017537 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017540 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:49.020997 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:49.017543 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.017547 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.017699 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.019789 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.020658 2573 server.go:1019] "Starting client certificate rotation"
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.020753 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:33:49.021412 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.021157 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:33:49.045984 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.045957 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:33:49.050568 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.050540 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:33:49.066385 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.066348 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:33:49.072145 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.072124 2573 log.go:25] "Validated CRI v1 image API"
Apr 22 15:33:49.074217 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.074199 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:33:49.075336 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.075318 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:33:49.076539 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.076510 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 aca54d50-f14b-499d-89dc-f2ad61ae1f7e:/dev/nvme0n1p4 bffd4ac8-7b1a-4fbe-b04e-5a4871e1e910:/dev/nvme0n1p3]
Apr 22 15:33:49.076605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.076538 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:33:49.083121 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.082955 2573 manager.go:217] Machine: {Timestamp:2026-04-22 15:33:49.080833675 +0000 UTC m=+0.406587316 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100426 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22fff00ae60632fa8c6b404a6e115f SystemUUID:ec22fff0-0ae6-0632-fa8c-6b404a6e115f BootID:eb9df683-b7a1-4abf-902e-1b3c30f3cd08 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3e:f8:c9:56:07 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3e:f8:c9:56:07 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:1e:13:0b:b6:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:33:49.083121 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.083113 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:33:49.083284 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.083272 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 15:33:49.086027 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.085999 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 15:33:49.086200 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.086029 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-128.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 15:33:49.086290 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.086212 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 15:33:49.086290 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.086222 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 15:33:49.086290 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.086236 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:33:49.086962 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.086950 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:33:49.088227 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.088213 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:33:49.088356 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.088347 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:33:49.090785 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.090772 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:33:49.090832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.090787 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:33:49.090832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.090802 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:33:49.090832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.090814 2573 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:33:49.090832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.090822 2573 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 15:33:49.091909 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.091895 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:33:49.091944 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.091915 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:33:49.095431 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.095410 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:33:49.096710 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.096695 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:33:49.098545 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098533 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098551 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098557 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098562 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098568 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098574 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098580 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098585 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098592 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:33:49.098598 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098599 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:33:49.098841 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098607 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:33:49.098841 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.098616 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:33:49.100275 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.100261 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:33:49.100318 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.100278 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:33:49.104522 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.104496 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:33:49.104641 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.104564 2573 server.go:1295] "Started kubelet" Apr 22 15:33:49.104742 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.104712 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:33:49.105163 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.105106 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:33:49.105315 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.105195 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:33:49.105895 ip-10-0-143-128 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 15:33:49.106538 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.106213 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-128.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:33:49.106538 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.106289 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:33:49.106538 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.106497 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-128.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 15:33:49.106957 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.106941 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:33:49.107126 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.107106 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:33:49.112790 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.112764 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:33:49.112937 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.112795 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:33:49.114359 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.113526 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:33:49.114359 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.113713 2573 factory.go:55] Registering systemd factory Apr 22 15:33:49.114359 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.113764 2573 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:33:49.114359 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.114099 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found" Apr 22 15:33:49.114607 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.114586 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:33:49.114707 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.114621 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:33:49.114707 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.114691 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:33:49.114808 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.114714 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:33:49.118299 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.118281 2573 factory.go:153] Registering CRI-O factory Apr 22 15:33:49.118479 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.118466 2573 factory.go:223] Registration of the crio container factory successfully Apr 22 15:33:49.118591 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.118564 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-128.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:33:49.118673 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.118652 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group 
\"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:33:49.118759 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.118747 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:33:49.118849 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.118840 2573 factory.go:103] Registering Raw factory Apr 22 15:33:49.118926 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.118918 2573 manager.go:1196] Started watching for new ooms in manager Apr 22 15:33:49.119865 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.118767 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-128.ec2.internal.18a8b7b23a1a37d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-128.ec2.internal,UID:ip-10-0-143-128.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-128.ec2.internal,},FirstTimestamp:2026-04-22 15:33:49.104519123 +0000 UTC m=+0.430272767,LastTimestamp:2026-04-22 15:33:49.104519123 +0000 UTC m=+0.430272767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-128.ec2.internal,}" Apr 22 15:33:49.120130 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.120113 2573 manager.go:319] Starting recovery of all containers Apr 22 15:33:49.120226 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.120115 2573 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:33:49.124549 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.124521 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nc8gv" Apr 22 15:33:49.129929 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.129903 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nc8gv" Apr 22 15:33:49.130132 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.130053 2573 manager.go:324] Recovery completed Apr 22 15:33:49.134734 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.134719 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.137206 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137180 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.137266 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137211 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.137266 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137223 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.137800 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137781 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:33:49.137800 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137798 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:33:49.137950 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.137817 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:33:49.139322 
ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.139232 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-128.ec2.internal.18a8b7b23c0cd355 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-128.ec2.internal,UID:ip-10-0-143-128.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-128.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-128.ec2.internal,},FirstTimestamp:2026-04-22 15:33:49.137195861 +0000 UTC m=+0.462949506,LastTimestamp:2026-04-22 15:33:49.137195861 +0000 UTC m=+0.462949506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-128.ec2.internal,}" Apr 22 15:33:49.141143 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.141129 2573 policy_none.go:49] "None policy: Start" Apr 22 15:33:49.141213 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.141148 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:33:49.141213 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.141160 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:33:49.192837 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.192819 2573 manager.go:341] "Starting Device Plugin manager" Apr 22 15:33:49.192950 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.192869 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:33:49.192950 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.192885 2573 server.go:85] "Starting device plugin registration server" Apr 22 15:33:49.193245 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:49.193232 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:33:49.193312 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.193249 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:33:49.193417 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.193398 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:33:49.193508 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.193487 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:33:49.193508 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.193501 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:33:49.194308 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.194288 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:33:49.194406 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.194334 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-128.ec2.internal\" not found" Apr 22 15:33:49.247887 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.247834 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:33:49.249263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.249213 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:33:49.249263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.249249 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:33:49.249366 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.249275 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 15:33:49.249366 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.249285 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:33:49.249471 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.249394 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:33:49.252429 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.252399 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:33:49.294390 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.294338 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.295386 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.295369 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.295506 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.295401 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.295506 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.295411 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.295506 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.295438 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.303696 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.303668 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.303880 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.303704 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-128.ec2.internal\": node \"ip-10-0-143-128.ec2.internal\" not found" Apr 22 
15:33:49.318540 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.318508 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found" Apr 22 15:33:49.349828 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.349791 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal"] Apr 22 15:33:49.349894 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.349884 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.350790 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.350772 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.350889 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.350812 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.350889 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.350827 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.352170 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352153 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.352319 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352302 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.352368 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352336 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.352902 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352882 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.352992 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352882 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.352992 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352919 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.352992 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352948 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.352992 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352950 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.352992 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.352966 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.354094 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.354077 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.354184 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.354113 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:33:49.354965 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.354951 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:33:49.355033 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.354977 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:33:49.355033 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.354989 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:33:49.378110 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.378084 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-128.ec2.internal\" not found" node="ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.382721 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.382696 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-128.ec2.internal\" not found" node="ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.416129 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.416090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.416129 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:49.416125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.416334 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.416144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc217a568b1f424e4a361f02b88acbe4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-128.ec2.internal\" (UID: \"bc217a568b1f424e4a361f02b88acbe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.419184 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.419166 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found" Apr 22 15:33:49.517127 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc217a568b1f424e4a361f02b88acbe4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-128.ec2.internal\" (UID: \"bc217a568b1f424e4a361f02b88acbe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" Apr 22 15:33:49.517127 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" Apr 22 
15:33:49.517127 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.517331 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc217a568b1f424e4a361f02b88acbe4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-128.ec2.internal\" (UID: \"bc217a568b1f424e4a361f02b88acbe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.517331 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.517331 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.517196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea5299ef480a56322702cea2246e1048-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal\" (UID: \"ea5299ef480a56322702cea2246e1048\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.520116 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.520093 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:49.620985 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.620943 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:49.680144 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.680115 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.684701 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:49.684674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:49.721376 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.721334 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:49.821968 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.821865 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:49.922459 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:49.922418 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.020924 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.020886 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 15:33:50.021632 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.021097 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:33:50.023043 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.023017 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.113841 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.113660 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 15:33:50.123221 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.123188 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.123378 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.123324 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:33:50.132014 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.131967 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:28:49 +0000 UTC" deadline="2027-10-27 21:53:01.416549445 +0000 UTC"
Apr 22 15:33:50.132014 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.132009 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13278h19m11.284543192s"
Apr 22 15:33:50.146723 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.146692 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5n9nz"
Apr 22 15:33:50.154584 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.154546 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5n9nz"
Apr 22 15:33:50.167092 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:50.167037 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea5299ef480a56322702cea2246e1048.slice/crio-fc794ebe65bd447258e933e591dc7feca8d6c2462381756c17d9681c74079145 WatchSource:0}: Error finding container fc794ebe65bd447258e933e591dc7feca8d6c2462381756c17d9681c74079145: Status 404 returned error can't find the container with id fc794ebe65bd447258e933e591dc7feca8d6c2462381756c17d9681c74079145
Apr 22 15:33:50.168960 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:50.168934 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc217a568b1f424e4a361f02b88acbe4.slice/crio-9ec994a0bbd36208b85456da394cb99ba4b5c13e40e15d32b77ad195a28fc98a WatchSource:0}: Error finding container 9ec994a0bbd36208b85456da394cb99ba4b5c13e40e15d32b77ad195a28fc98a: Status 404 returned error can't find the container with id 9ec994a0bbd36208b85456da394cb99ba4b5c13e40e15d32b77ad195a28fc98a
Apr 22 15:33:50.172750 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.172730 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:33:50.223897 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.223857 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.253097 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.253026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" event={"ID":"bc217a568b1f424e4a361f02b88acbe4","Type":"ContainerStarted","Data":"9ec994a0bbd36208b85456da394cb99ba4b5c13e40e15d32b77ad195a28fc98a"}
Apr 22 15:33:50.253930 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.253909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" event={"ID":"ea5299ef480a56322702cea2246e1048","Type":"ContainerStarted","Data":"fc794ebe65bd447258e933e591dc7feca8d6c2462381756c17d9681c74079145"}
Apr 22 15:33:50.324504 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.324410 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.352910 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.352888 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:50.419928 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.419710 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:50.424592 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.424562 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.525653 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.525616 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.626569 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:50.626435 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-128.ec2.internal\" not found"
Apr 22 15:33:50.675018 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.674988 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:50.713353 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.713316 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:50.727158 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.727130 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:33:50.728212 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.728190 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal"
Apr 22 15:33:50.736486 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:50.736369 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:33:51.092354 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.092321 2573 apiserver.go:52] "Watching apiserver"
Apr 22 15:33:51.099985 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.099957 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 15:33:51.100472 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.100438 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9","openshift-cluster-node-tuning-operator/tuned-kvwxs","openshift-dns/node-resolver-64nrn","openshift-image-registry/node-ca-8sqml","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal","openshift-multus/multus-g55gp","openshift-multus/network-metrics-daemon-vk5nl","kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal","openshift-multus/multus-additional-cni-plugins-zr4rf","openshift-network-diagnostics/network-check-target-jgsl7","openshift-network-operator/iptables-alerter-49km7","openshift-ovn-kubernetes/ovnkube-node-kwt7w","kube-system/konnectivity-agent-m2d4r"]
Apr 22 15:33:51.102805 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.102777 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:33:51.102942 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.102877 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:33:51.104818 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.104792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.106002 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.105975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.107611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.107589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.108493 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.108472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.108721 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.108706 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 15:33:51.108777 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.108722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.108877 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.108865 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmt97\""
Apr 22 15:33:51.109297 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-64nrn"
Apr 22 15:33:51.109384 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109369 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.109604 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109581 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.109847 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109828 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bwr7v\""
Apr 22 15:33:51.109924 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109830 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.110003 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.109983 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.110564 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.110544 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.110768 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.110748 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:33:51.111213 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.111198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.111278 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.111228 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m6qv5\""
Apr 22 15:33:51.111416 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.111403 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:33:51.112273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.111664 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ldwqs\""
Apr 22 15:33:51.112273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.111764 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:33:51.112273 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.111828 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:33:51.112273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112189 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.112273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112258 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 15:33:51.112529 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112314 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.112529 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112397 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wthx2\""
Apr 22 15:33:51.112529 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.112529 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112525 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.112775 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 15:33:51.112854 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.112755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbbc9\""
Apr 22 15:33:51.113188 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.113147 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.113733 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.113714 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 15:33:51.115118 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.115096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.117077 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.117037 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.117184 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.117131 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.117184 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.117151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-222gn\""
Apr 22 15:33:51.117375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.117356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 15:33:51.117849 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.117825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 15:33:51.118388 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.118356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m2d4r"
Apr 22 15:33:51.118487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.118468 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 15:33:51.119248 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.119227 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 15:33:51.119351 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.119309 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 15:33:51.119490 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.119235 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 15:33:51.119532 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.119481 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 15:33:51.119569 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.119560 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-db5kj\""
Apr 22 15:33:51.120946 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.120923 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 15:33:51.121042 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.120984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdcdt\""
Apr 22 15:33:51.121042 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.120929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 15:33:51.125866 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.125834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-tmp\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.125989 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.125892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-systemd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.125989 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.125924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.125989 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.125953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-k8s-cni-cncf-io\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.125989 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.125977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-kubelet\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-socket-dir-parent\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-netns\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-systemd\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126141 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm95c\" (UniqueName: \"kubernetes.io/projected/0708298d-9f47-4968-9489-c7cb22cb282c-kube-api-access-xm95c\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cnibin\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.126226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-os-release\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-hostroot\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-socket-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfr7m\" (UniqueName: \"kubernetes.io/projected/3377967e-b456-4b8d-922f-ecf8e91bf364-kube-api-access-sfr7m\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-log-socket\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-bin\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-multus\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4jx\" (UniqueName: \"kubernetes.io/projected/37d4dbd3-61f0-47a0-bd23-69d3cd755850-kube-api-access-fj4jx\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-var-lib-kubelet\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysconfig\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-sys\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.126487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-hosts-file\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5cl\" (UniqueName: \"kubernetes.io/projected/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-kube-api-access-lj5cl\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-serviceca\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-env-overrides\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb856\" (UniqueName: \"kubernetes.io/projected/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-kube-api-access-pb856\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-slash\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-etc-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-script-lib\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnv5x\" (UniqueName: \"kubernetes.io/projected/fadfec8b-d979-4153-81da-c1de52954dd2-kube-api-access-mnv5x\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126662 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-os-release\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnd75\" (UniqueName: \"kubernetes.io/projected/a74db667-642b-4eca-91b6-af4048b9410f-kube-api-access-dnd75\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-config\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-system-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-conf-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.126871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-etc-kubernetes\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22
15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-system-cni-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3377967e-b456-4b8d-922f-ecf8e91bf364-iptables-alerter-script\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-lib-modules\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-host\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-run\") pod 
\"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-tmp-dir\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126911 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-sys-fs\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.126997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-host\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-ovn\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127080 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-cni-binary-copy\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-conf\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127168 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwsl2\" (UniqueName: \"kubernetes.io/projected/88c02699-dde6-4f8d-bf08-671bfdb840da-kube-api-access-rwsl2\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127424 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-kubelet\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-node-log\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-netd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-multus-certs\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " 
pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-modprobe-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-kubernetes\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-bin\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovn-node-metrics-cert\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-multus-daemon-config\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-systemd-units\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jsp5\" (UniqueName: \"kubernetes.io/projected/e237e451-58c6-4255-bef9-a4ac5f2d06c7-kube-api-access-9jsp5\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-registration-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: 
I0422 15:33:51.127466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-device-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-tuned\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-var-lib-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.127905 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:51.128454 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-cnibin\") pod \"multus-g55gp\" (UID: 
\"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.128454 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.128454 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127581 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-netns\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.128454 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.127636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3377967e-b456-4b8d-922f-ecf8e91bf364-host-slash\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7" Apr 22 15:33:51.155342 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.155299 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:28:50 +0000 UTC" deadline="2028-01-27 15:36:57.29827622 +0000 UTC" Apr 22 15:33:51.155342 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.155331 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15480h3m6.142948199s" Apr 22 15:33:51.214363 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.214330 2573 desired_state_of_world_populator.go:158] 
"Finished populating initial desired state of world" Apr 22 15:33:51.228119 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysconfig\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.228119 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-sys\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-hosts-file\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysconfig\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5cl\" (UniqueName: \"kubernetes.io/projected/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-kube-api-access-lj5cl\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-serviceca\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-sys\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-env-overrides\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb856\" (UniqueName: \"kubernetes.io/projected/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-kube-api-access-pb856\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " 
pod="openshift-image-registry/node-ca-8sqml" Apr 22 15:33:51.228349 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-hosts-file\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-slash\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-etc-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-script-lib\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnv5x\" (UniqueName: \"kubernetes.io/projected/fadfec8b-d979-4153-81da-c1de52954dd2-kube-api-access-mnv5x\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-os-release\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-etc-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnd75\" (UniqueName: 
\"kubernetes.io/projected/a74db667-642b-4eca-91b6-af4048b9410f-kube-api-access-dnd75\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-slash\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/538e0c6d-a79c-4576-9d34-fc920e2c9aef-konnectivity-ca\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-config\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-system-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-conf-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-etc-kubernetes\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-system-cni-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.228803 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3377967e-b456-4b8d-922f-ecf8e91bf364-iptables-alerter-script\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-lib-modules\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-host\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-run\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-tmp-dir\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-serviceca\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-env-overrides\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-sys-fs\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-host\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-host\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " pod="openshift-image-registry/node-ca-8sqml"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-ovn\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.229648 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-sys-fs\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-conf-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-ovn\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-cni-binary-copy\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-conf\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-etc-kubernetes\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwsl2\" (UniqueName: \"kubernetes.io/projected/88c02699-dde6-4f8d-bf08-671bfdb840da-kube-api-access-rwsl2\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-kubelet\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-system-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-node-log\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-netd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-script-lib\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-multus-certs\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-multus-certs\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-kubelet\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovnkube-config\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-node-log\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.230543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3377967e-b456-4b8d-922f-ecf8e91bf364-iptables-alerter-script\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-sysctl-conf\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-netd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.228676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-os-release\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-run\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-modprobe-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-kubernetes\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-system-cni-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-lib-modules\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-bin\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-cni-bin\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-host\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cni-binary-copy\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovn-node-metrics-cert\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-modprobe-d\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-kubernetes\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-multus-daemon-config\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-systemd-units\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jsp5\" (UniqueName: \"kubernetes.io/projected/e237e451-58c6-4255-bef9-a4ac5f2d06c7-kube-api-access-9jsp5\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-tmp-dir\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-systemd-units\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-registration-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-device-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-tuned\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-var-lib-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-device-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-cnibin\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-var-lib-openvswitch\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-netns\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3377967e-b456-4b8d-922f-ecf8e91bf364-host-slash\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-tmp\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.231706 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.229891 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-systemd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.229982 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:33:51.729948095 +0000 UTC m=+3.055701729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-k8s-cni-cncf-io\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230014 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-kubelet\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.229725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-registration-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-cni-binary-copy\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3377967e-b456-4b8d-922f-ecf8e91bf364-host-slash\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7"
Apr 22 15:33:51.232438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-k8s-cni-cncf-io\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-run-netns\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.230036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-kubelet\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-socket-dir-parent\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-netns\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-systemd\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm95c\" (UniqueName: \"kubernetes.io/projected/0708298d-9f47-4968-9489-c7cb22cb282c-kube-api-access-xm95c\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp"
Apr 22 15:33:51.232907 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cnibin\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-os-release\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-hostroot\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.232974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-socket-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233042 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfr7m\" (UniqueName: \"kubernetes.io/projected/3377967e-b456-4b8d-922f-ecf8e91bf364-kube-api-access-sfr7m\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-log-socket\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-bin\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233173 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/538e0c6d-a79c-4576-9d34-fc920e2c9aef-agent-certs\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-multus\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233375 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:51.233237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj4jx\" (UniqueName: \"kubernetes.io/projected/37d4dbd3-61f0-47a0-bd23-69d3cd755850-kube-api-access-fj4jx\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.233375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-var-lib-kubelet\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-var-lib-kubelet\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-socket-dir-parent\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233601 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a74db667-642b-4eca-91b6-af4048b9410f-multus-daemon-config\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 
15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-cnibin\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-host-run-netns\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-systemd\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-bin\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-run-systemd\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.233862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.233809 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-host-var-lib-cni-multus\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.234335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e237e451-58c6-4255-bef9-a4ac5f2d06c7-log-socket\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.234335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-hostroot\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.234335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234256 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-multus-cni-dir\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.234335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74db667-642b-4eca-91b6-af4048b9410f-os-release\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.234335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/37d4dbd3-61f0-47a0-bd23-69d3cd755850-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.234523 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fadfec8b-d979-4153-81da-c1de52954dd2-socket-dir\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.234523 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.234462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37d4dbd3-61f0-47a0-bd23-69d3cd755850-cnibin\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.236541 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.236503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-tmp\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.236667 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.236650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e237e451-58c6-4255-bef9-a4ac5f2d06c7-ovn-node-metrics-cert\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.239015 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.238989 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/88c02699-dde6-4f8d-bf08-671bfdb840da-etc-tuned\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.255031 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.254994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5cl\" (UniqueName: \"kubernetes.io/projected/f9cf0d97-f5d8-44fe-a781-fa3940c08f48-kube-api-access-lj5cl\") pod \"node-resolver-64nrn\" (UID: \"f9cf0d97-f5d8-44fe-a781-fa3940c08f48\") " pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.256734 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.256685 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:51.256734 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.256714 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:51.256734 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.256729 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:51.256968 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.256840 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. No retries permitted until 2026-04-22 15:33:51.756817426 +0000 UTC m=+3.082571058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:51.259486 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.259452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jsp5\" (UniqueName: \"kubernetes.io/projected/e237e451-58c6-4255-bef9-a4ac5f2d06c7-kube-api-access-9jsp5\") pod \"ovnkube-node-kwt7w\" (UID: \"e237e451-58c6-4255-bef9-a4ac5f2d06c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.260017 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.259906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnd75\" (UniqueName: \"kubernetes.io/projected/a74db667-642b-4eca-91b6-af4048b9410f-kube-api-access-dnd75\") pod \"multus-g55gp\" (UID: \"a74db667-642b-4eca-91b6-af4048b9410f\") " pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.260017 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.259956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwsl2\" (UniqueName: \"kubernetes.io/projected/88c02699-dde6-4f8d-bf08-671bfdb840da-kube-api-access-rwsl2\") pod \"tuned-kvwxs\" (UID: \"88c02699-dde6-4f8d-bf08-671bfdb840da\") " pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.260210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.260038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb856\" (UniqueName: \"kubernetes.io/projected/2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69-kube-api-access-pb856\") pod \"node-ca-8sqml\" (UID: \"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69\") " 
pod="openshift-image-registry/node-ca-8sqml" Apr 22 15:33:51.261393 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.261366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnv5x\" (UniqueName: \"kubernetes.io/projected/fadfec8b-d979-4153-81da-c1de52954dd2-kube-api-access-mnv5x\") pod \"aws-ebs-csi-driver-node-pmfn9\" (UID: \"fadfec8b-d979-4153-81da-c1de52954dd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.261505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.261400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4jx\" (UniqueName: \"kubernetes.io/projected/37d4dbd3-61f0-47a0-bd23-69d3cd755850-kube-api-access-fj4jx\") pod \"multus-additional-cni-plugins-zr4rf\" (UID: \"37d4dbd3-61f0-47a0-bd23-69d3cd755850\") " pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.262179 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.262144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm95c\" (UniqueName: \"kubernetes.io/projected/0708298d-9f47-4968-9489-c7cb22cb282c-kube-api-access-xm95c\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:51.263406 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.263382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfr7m\" (UniqueName: \"kubernetes.io/projected/3377967e-b456-4b8d-922f-ecf8e91bf364-kube-api-access-sfr7m\") pod \"iptables-alerter-49km7\" (UID: \"3377967e-b456-4b8d-922f-ecf8e91bf364\") " pod="openshift-network-operator/iptables-alerter-49km7" Apr 22 15:33:51.284209 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.284170 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hgz25"] Apr 22 15:33:51.287164 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:51.287133 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.287331 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.287212 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:33:51.333812 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.333764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-kubelet-config\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.333974 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.333832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/538e0c6d-a79c-4576-9d34-fc920e2c9aef-agent-certs\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.333974 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.333862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.333974 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.333901 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/538e0c6d-a79c-4576-9d34-fc920e2c9aef-konnectivity-ca\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.333974 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.333928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-dbus\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.334661 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.334638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/538e0c6d-a79c-4576-9d34-fc920e2c9aef-konnectivity-ca\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.336565 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.336541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/538e0c6d-a79c-4576-9d34-fc920e2c9aef-agent-certs\") pod \"konnectivity-agent-m2d4r\" (UID: \"538e0c6d-a79c-4576-9d34-fc920e2c9aef\") " pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.425621 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.425535 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" Apr 22 15:33:51.432392 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.432353 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8sqml" Apr 22 15:33:51.434255 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.434221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.434375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.434289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-dbus\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.434375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.434345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-kubelet-config\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.434491 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.434399 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:51.434491 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.434429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-dbus\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.434491 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.434433 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d6e0d117-87ac-43fe-bf80-ea2add6000f1-kubelet-config\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.434491 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.434472 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:51.934450943 +0000 UTC m=+3.260204572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:51.441408 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.441374 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g55gp" Apr 22 15:33:51.446357 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.446325 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-64nrn" Apr 22 15:33:51.454043 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.454011 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" Apr 22 15:33:51.460839 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.460799 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" Apr 22 15:33:51.468629 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.468597 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-49km7" Apr 22 15:33:51.475522 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.475462 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:33:51.476718 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.476692 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:33:51.481219 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.481186 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:33:51.736566 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.736533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:51.736727 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.736705 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:51.736818 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.736805 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:33:52.736782442 +0000 UTC m=+4.062536070 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:51.837916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.837870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:51.838093 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.838039 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:51.838093 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.838080 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:51.838210 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.838095 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:51.838210 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.838170 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. 
No retries permitted until 2026-04-22 15:33:52.838147849 +0000 UTC m=+4.163901484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:51.939013 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:51.938965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:51.939178 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.939121 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:51.939225 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:51.939205 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:52.939189828 +0000 UTC m=+4.264943456 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:51.945424 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.945397 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538e0c6d_a79c_4576_9d34_fc920e2c9aef.slice/crio-3e482ecb686c0644fa65c2ba70ee46aa7225ec093d7df9d23a53e6287e24a005 WatchSource:0}: Error finding container 3e482ecb686c0644fa65c2ba70ee46aa7225ec093d7df9d23a53e6287e24a005: Status 404 returned error can't find the container with id 3e482ecb686c0644fa65c2ba70ee46aa7225ec093d7df9d23a53e6287e24a005 Apr 22 15:33:51.955175 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.955146 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74db667_642b_4eca_91b6_af4048b9410f.slice/crio-11072097029eb1ef98918abc348761db63938dcde70200c1413c1468478019b0 WatchSource:0}: Error finding container 11072097029eb1ef98918abc348761db63938dcde70200c1413c1468478019b0: Status 404 returned error can't find the container with id 11072097029eb1ef98918abc348761db63938dcde70200c1413c1468478019b0 Apr 22 15:33:51.956370 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.956333 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d4dbd3_61f0_47a0_bd23_69d3cd755850.slice/crio-a00a09c5ac59410804ad6dd0e9c23a951fb8b43c010168341161d56af97a76a7 WatchSource:0}: Error finding container a00a09c5ac59410804ad6dd0e9c23a951fb8b43c010168341161d56af97a76a7: Status 404 returned error can't find the container with id a00a09c5ac59410804ad6dd0e9c23a951fb8b43c010168341161d56af97a76a7 Apr 22 
15:33:51.957444 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.957417 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode237e451_58c6_4255_bef9_a4ac5f2d06c7.slice/crio-789a170a7a633b5bb2cc4dd21e2de58f3587a938be1c4da592a26a6c4af0dfc3 WatchSource:0}: Error finding container 789a170a7a633b5bb2cc4dd21e2de58f3587a938be1c4da592a26a6c4af0dfc3: Status 404 returned error can't find the container with id 789a170a7a633b5bb2cc4dd21e2de58f3587a938be1c4da592a26a6c4af0dfc3 Apr 22 15:33:51.958630 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.958479 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb9d855_0dbd_4a3b_93cc_7fb30fd48f69.slice/crio-f651d39ae5d9c1c1102dae39493d49601f30b988fa913b7e925a2e5c8befd20f WatchSource:0}: Error finding container f651d39ae5d9c1c1102dae39493d49601f30b988fa913b7e925a2e5c8befd20f: Status 404 returned error can't find the container with id f651d39ae5d9c1c1102dae39493d49601f30b988fa913b7e925a2e5c8befd20f Apr 22 15:33:51.961191 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:33:51.959388 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c02699_dde6_4f8d_bf08_671bfdb840da.slice/crio-6ad5acd08a63c6d7ae8aa9bfd411269f61e678fd8f0d76ecfca687e4f34e0529 WatchSource:0}: Error finding container 6ad5acd08a63c6d7ae8aa9bfd411269f61e678fd8f0d76ecfca687e4f34e0529: Status 404 returned error can't find the container with id 6ad5acd08a63c6d7ae8aa9bfd411269f61e678fd8f0d76ecfca687e4f34e0529 Apr 22 15:33:52.156112 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.156077 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:28:50 +0000 UTC" deadline="2028-01-16 10:26:20.838338684 +0000 UTC" Apr 22 15:33:52.156112 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:33:52.156107 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15210h52m28.682234361s" Apr 22 15:33:52.261999 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.261894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" event={"ID":"bc217a568b1f424e4a361f02b88acbe4","Type":"ContainerStarted","Data":"a5a361bb8c29756da6f6d555711eb57eb78e5e8260eaeeae3f3e7e1af42949cd"} Apr 22 15:33:52.262929 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.262904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" event={"ID":"88c02699-dde6-4f8d-bf08-671bfdb840da","Type":"ContainerStarted","Data":"6ad5acd08a63c6d7ae8aa9bfd411269f61e678fd8f0d76ecfca687e4f34e0529"} Apr 22 15:33:52.263920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.263893 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g55gp" event={"ID":"a74db667-642b-4eca-91b6-af4048b9410f","Type":"ContainerStarted","Data":"11072097029eb1ef98918abc348761db63938dcde70200c1413c1468478019b0"} Apr 22 15:33:52.264817 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.264795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-64nrn" event={"ID":"f9cf0d97-f5d8-44fe-a781-fa3940c08f48","Type":"ContainerStarted","Data":"4603ff2309b4f51d2c17f496119b778e84dcbf4bee93cd7ed8192e76b56a433a"} Apr 22 15:33:52.265920 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.265895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m2d4r" event={"ID":"538e0c6d-a79c-4576-9d34-fc920e2c9aef","Type":"ContainerStarted","Data":"3e482ecb686c0644fa65c2ba70ee46aa7225ec093d7df9d23a53e6287e24a005"} Apr 22 15:33:52.267466 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.267447 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/node-ca-8sqml" event={"ID":"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69","Type":"ContainerStarted","Data":"f651d39ae5d9c1c1102dae39493d49601f30b988fa913b7e925a2e5c8befd20f"} Apr 22 15:33:52.268381 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.268362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"789a170a7a633b5bb2cc4dd21e2de58f3587a938be1c4da592a26a6c4af0dfc3"} Apr 22 15:33:52.269357 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.269339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerStarted","Data":"a00a09c5ac59410804ad6dd0e9c23a951fb8b43c010168341161d56af97a76a7"} Apr 22 15:33:52.270144 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.270127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" event={"ID":"fadfec8b-d979-4153-81da-c1de52954dd2","Type":"ContainerStarted","Data":"57b02c0e8ba18a321e152b9e90702599645c8265e63d3ada422c5da65cdfa6fd"} Apr 22 15:33:52.272199 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.272178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-49km7" event={"ID":"3377967e-b456-4b8d-922f-ecf8e91bf364","Type":"ContainerStarted","Data":"37a6539c1da59e0b33862cb21856eb7804eb76c26918b2ac3819dfcb82dd4ce6"} Apr 22 15:33:52.274587 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.274541 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-128.ec2.internal" podStartSLOduration=2.27453095 podStartE2EDuration="2.27453095s" podCreationTimestamp="2026-04-22 15:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:33:52.274270188 +0000 UTC m=+3.600023828" watchObservedRunningTime="2026-04-22 15:33:52.27453095 +0000 UTC m=+3.600284619" Apr 22 15:33:52.746460 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.745816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:52.746460 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.745992 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:52.746460 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.746079 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:33:54.746038965 +0000 UTC m=+6.071792597 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:52.846414 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.846370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:52.846730 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.846711 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:52.846816 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.846737 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:52.846816 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.846749 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:52.846816 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.846811 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. 
No retries permitted until 2026-04-22 15:33:54.84679192 +0000 UTC m=+6.172545554 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:52.948223 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:52.947537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:52.948223 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.947694 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:52.948223 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:52.947761 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:54.947740505 +0000 UTC m=+6.273494158 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:53.250758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:53.250083 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:53.250758 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:53.250217 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:33:53.250758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:53.250609 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:53.250758 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:53.250716 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:33:53.252531 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:53.252340 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:53.252531 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:53.252449 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:33:53.297918 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:53.296743 2573 generic.go:358] "Generic (PLEG): container finished" podID="ea5299ef480a56322702cea2246e1048" containerID="28d03d0a69314d117bb978c527944ff7dfb10ebe6ddc6d3b7dafb75db6bdc5a5" exitCode=0 Apr 22 15:33:53.297918 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:53.297701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" event={"ID":"ea5299ef480a56322702cea2246e1048","Type":"ContainerDied","Data":"28d03d0a69314d117bb978c527944ff7dfb10ebe6ddc6d3b7dafb75db6bdc5a5"} Apr 22 15:33:54.303802 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:54.303016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" event={"ID":"ea5299ef480a56322702cea2246e1048","Type":"ContainerStarted","Data":"78891b8edac7c334012b52dcf253cbb6123741db22b9ce57149e61e9d042edef"} Apr 22 15:33:54.763040 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:54.762995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:54.763253 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.763219 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:54.763331 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.763276 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs 
podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:33:58.763258886 +0000 UTC m=+10.089012515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:54.863611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:54.863576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:54.863782 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.863715 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:54.863782 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.863734 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:54.863782 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.863746 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:54.864040 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.863801 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. No retries permitted until 2026-04-22 15:33:58.863783097 +0000 UTC m=+10.189536730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:54.964968 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:54.964926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:54.965228 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.965148 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:54.965228 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:54.965214 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:58.965195906 +0000 UTC m=+10.290949541 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:55.253413 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:55.253384 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:55.253568 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:55.253495 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:33:55.253954 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:55.253931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:55.254075 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:55.254040 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:33:55.254154 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:55.254141 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:55.254244 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:55.254225 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:33:57.279380 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:57.279348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:57.279832 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:57.279479 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:33:57.279832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:57.279528 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:57.279832 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:57.279595 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:57.279832 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:57.279695 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:33:57.279832 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:57.279801 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:33:58.797732 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:58.797680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:58.798235 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.797821 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:58.798235 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.797892 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:06.797876068 +0000 UTC m=+18.123629696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:58.899034 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:58.898990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:58.899261 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.899205 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:58.899261 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.899231 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:58.899261 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.899244 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:58.899375 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:58.899312 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p 
podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. No retries permitted until 2026-04-22 15:34:06.899293248 +0000 UTC m=+18.225046879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:58.999861 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:58.999822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:59.000093 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:59.000001 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:59.000148 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:59.000106 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:07.000050612 +0000 UTC m=+18.325804268 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:33:59.250811 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:59.250598 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:33:59.250973 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:59.250878 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:33:59.250973 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:59.250716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:33:59.250973 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:59.250940 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:33:59.250973 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:33:59.250654 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:33:59.251138 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:33:59.251012 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:01.249981 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:01.249944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:01.250476 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:01.249944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:01.250476 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:01.250051 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:01.250476 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:01.250148 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:01.250476 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:01.249944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:01.250476 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:01.250253 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:03.251802 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:03.251767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:03.252269 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:03.251898 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:03.252269 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:03.252246 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:03.252368 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:03.252330 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:03.252436 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:03.252370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:03.252484 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:03.252435 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:05.250539 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:05.250498 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:05.250936 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:05.250495 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:05.250936 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:05.250632 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:05.250936 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:05.250495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:05.250936 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:05.250691 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:05.250936 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:05.250769 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:06.858488 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:06.858436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:06.858973 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.858591 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:06.858973 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.858671 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:34:22.858649768 +0000 UTC m=+34.184403416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:06.959859 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:06.959819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:06.960044 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.959981 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:06.960044 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.960004 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:06.960044 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.960019 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:06.960188 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:06.960095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:22.960080791 +0000 UTC m=+34.285834423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:07.060394 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:07.060356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:07.060579 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:07.060484 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:07.060579 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:07.060562 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.060545649 +0000 UTC m=+34.386299282 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:07.250329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:07.250285 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:07.250329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:07.250320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:07.250576 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:07.250414 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:07.250576 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:07.250417 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:07.250576 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:07.250499 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:07.250693 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:07.250585 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:09.253813 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:09.252259 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:09.253813 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:09.253641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:09.254275 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:09.253913 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:09.254644 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:09.254304 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:09.254644 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:09.254346 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:09.254644 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:09.254437 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:10.336829 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.336552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" event={"ID":"fadfec8b-d979-4153-81da-c1de52954dd2","Type":"ContainerStarted","Data":"bbb91e4b08bd7002877bec34a397b082766f95a6ccf291cde3a47dbf2de44824"} Apr 22 15:34:10.337861 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.337834 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" event={"ID":"88c02699-dde6-4f8d-bf08-671bfdb840da","Type":"ContainerStarted","Data":"5ef3b2762ef52ac6a7e1dd0b89ad83ddc042aa4121df948fd6fa0b044b9e2d66"} Apr 22 15:34:10.339299 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.339272 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g55gp" event={"ID":"a74db667-642b-4eca-91b6-af4048b9410f","Type":"ContainerStarted","Data":"810399a8db545593b12a29a45857c67d0d2cf195a43ddd6552fdba330932ae26"} Apr 22 15:34:10.340481 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.340449 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-64nrn" event={"ID":"f9cf0d97-f5d8-44fe-a781-fa3940c08f48","Type":"ContainerStarted","Data":"e680247ae7fccf6457af6159a0a686743a4aa5268f227c3e28d60e73589d0b55"} Apr 22 15:34:10.341726 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.341702 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m2d4r" event={"ID":"538e0c6d-a79c-4576-9d34-fc920e2c9aef","Type":"ContainerStarted","Data":"a322e6fd6324ca308ab8fef9264e751d43bce1182de49fb0c75119ef92804802"} Apr 22 15:34:10.342915 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.342895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8sqml" event={"ID":"2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69","Type":"ContainerStarted","Data":"50fa0899a3a51ae9b6f824aac1ab76c858138c5bff619eea775152b11e0ee0c0"} Apr 22 15:34:10.345250 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345229 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:34:10.345554 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345529 2573 generic.go:358] "Generic (PLEG): container finished" podID="e237e451-58c6-4255-bef9-a4ac5f2d06c7" containerID="4f236b000bcb115c1899ce3841a33afa8bb6b5f2753d06ddc1774ffa01d3cae9" exitCode=1 Apr 22 15:34:10.345680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"897a615c0a6e3467096524ed9ecfc74f687694ef5ac469588934ad061b042bbd"} Apr 22 15:34:10.345680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"dcb568f7bffdd984e6926be88fab6813a681e09c39f9f3875ad5f70acd0a870e"} Apr 22 15:34:10.345680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345635 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" 
event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"e9ad21f7d91bb2282d05c06e3f392bc5c6d564fe477b2cd4770960abb2bad167"} Apr 22 15:34:10.345680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerDied","Data":"4f236b000bcb115c1899ce3841a33afa8bb6b5f2753d06ddc1774ffa01d3cae9"} Apr 22 15:34:10.345680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.345663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"dbb7d7a6ab1e6b6b82d971d12bb45648285e6c57788ea4fca4c4b66d43f3d81a"} Apr 22 15:34:10.347016 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.346992 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="f1fe3de2d6f1446c956211b85913bafdea4fff3f5efe052b390d6b3090b4f386" exitCode=0 Apr 22 15:34:10.347136 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.347026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"f1fe3de2d6f1446c956211b85913bafdea4fff3f5efe052b390d6b3090b4f386"} Apr 22 15:34:10.356854 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.356804 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kvwxs" podStartSLOduration=3.831031828 podStartE2EDuration="21.356789605s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.96178654 +0000 UTC m=+3.287540172" lastFinishedPulling="2026-04-22 15:34:09.487544304 +0000 UTC m=+20.813297949" observedRunningTime="2026-04-22 15:34:10.356209792 +0000 UTC 
m=+21.681963442" watchObservedRunningTime="2026-04-22 15:34:10.356789605 +0000 UTC m=+21.682543255" Apr 22 15:34:10.356996 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.356971 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-128.ec2.internal" podStartSLOduration=20.356964583 podStartE2EDuration="20.356964583s" podCreationTimestamp="2026-04-22 15:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:33:54.319101488 +0000 UTC m=+5.644855141" watchObservedRunningTime="2026-04-22 15:34:10.356964583 +0000 UTC m=+21.682718235" Apr 22 15:34:10.368567 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.368517 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8sqml" podStartSLOduration=4.07809215 podStartE2EDuration="21.368502401s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.960535253 +0000 UTC m=+3.286288881" lastFinishedPulling="2026-04-22 15:34:09.250945488 +0000 UTC m=+20.576699132" observedRunningTime="2026-04-22 15:34:10.368031875 +0000 UTC m=+21.693785526" watchObservedRunningTime="2026-04-22 15:34:10.368502401 +0000 UTC m=+21.694256052" Apr 22 15:34:10.381380 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.381327 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-64nrn" podStartSLOduration=4.080429978 podStartE2EDuration="21.381313249s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.949971753 +0000 UTC m=+3.275725385" lastFinishedPulling="2026-04-22 15:34:09.250855027 +0000 UTC m=+20.576608656" observedRunningTime="2026-04-22 15:34:10.380953235 +0000 UTC m=+21.706706885" watchObservedRunningTime="2026-04-22 15:34:10.381313249 +0000 UTC m=+21.707066900" 
Apr 22 15:34:10.424558 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.424497 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g55gp" podStartSLOduration=3.868505003 podStartE2EDuration="21.424481864s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.957100686 +0000 UTC m=+3.282854317" lastFinishedPulling="2026-04-22 15:34:09.513077547 +0000 UTC m=+20.838831178" observedRunningTime="2026-04-22 15:34:10.424025777 +0000 UTC m=+21.749779428" watchObservedRunningTime="2026-04-22 15:34:10.424481864 +0000 UTC m=+21.750235514" Apr 22 15:34:10.438333 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:10.438282 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m2d4r" podStartSLOduration=12.413477676 podStartE2EDuration="21.438265129s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.949970322 +0000 UTC m=+3.275723957" lastFinishedPulling="2026-04-22 15:34:00.974757777 +0000 UTC m=+12.300511410" observedRunningTime="2026-04-22 15:34:10.438042636 +0000 UTC m=+21.763796265" watchObservedRunningTime="2026-04-22 15:34:10.438265129 +0000 UTC m=+21.764018779" Apr 22 15:34:11.250423 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.250388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:11.250673 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.250443 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:11.250673 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.250568 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:11.250673 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:11.250577 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e" Apr 22 15:34:11.250835 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:11.250702 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:11.250835 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:11.250790 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:11.353296 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.353263 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:34:11.353908 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.353647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"f4863f4bb0e6e5e59f87ee3c9ed64fade28387fedb739234ed7fd461dc5c587a"} Apr 22 15:34:11.355702 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.355529 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-49km7" event={"ID":"3377967e-b456-4b8d-922f-ecf8e91bf364","Type":"ContainerStarted","Data":"dcdb3fe76e197483d4e60d383c150883a7a74ebdf6df9fe2e4fe65d505796495"} Apr 22 15:34:11.369198 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.369143 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-49km7" podStartSLOduration=4.838273294 podStartE2EDuration="22.36912405s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.954244585 +0000 UTC m=+3.279998213" lastFinishedPulling="2026-04-22 15:34:09.48509534 +0000 UTC m=+20.810848969" observedRunningTime="2026-04-22 15:34:11.368647541 +0000 UTC m=+22.694401191" watchObservedRunningTime="2026-04-22 15:34:11.36912405 +0000 UTC m=+22.694877701" Apr 22 15:34:11.464652 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:11.464616 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:34:12.211043 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.210915 
2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:34:11.464643741Z","UUID":"c679ea66-c57a-4f48-9879-4fe662fe5d38","Handler":null,"Name":"","Endpoint":""} Apr 22 15:34:12.213013 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.212986 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:34:12.213013 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.213019 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:34:12.359382 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.359343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" event={"ID":"fadfec8b-d979-4153-81da-c1de52954dd2","Type":"ContainerStarted","Data":"727471dc7e5cb1c16d9a59b05af1e0105b83a27a9b620c07ab5c013ef0a137cf"} Apr 22 15:34:12.906725 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.906689 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:34:12.907377 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:12.907345 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-m2d4r" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:13.249711 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:13.249711 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:13.249772 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:13.249873 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:13.249963 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1" Apr 22 15:34:13.250112 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:13.250029 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:13.364538 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:13.364512 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log"
Apr 22 15:34:13.365168 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:13.365137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"05ec3da61fb35eb790c4c2b91797b8bf737cd3dec8db55ffb111668fb8849701"}
Apr 22 15:34:14.649764 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:14.649729 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m2d4r"
Apr 22 15:34:14.650304 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:14.649854 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:34:14.650609 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:14.650581 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m2d4r"
Apr 22 15:34:15.250332 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.250122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:34:15.250479 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.250122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25"
Apr 22 15:34:15.250479 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:15.250422 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:15.250579 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:15.250477 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1"
Apr 22 15:34:15.250579 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.250150 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:15.250579 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:15.250572 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:34:15.371882 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.371856 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log"
Apr 22 15:34:15.372227 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.372199 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"66f738375a471a3f3c2863c129b59106637bf78040ca3707899f6699b788cb6c"}
Apr 22 15:34:15.372654 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.372629 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:34:15.372766 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.372658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:34:15.372766 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.372702 2573 scope.go:117] "RemoveContainer" containerID="4f236b000bcb115c1899ce3841a33afa8bb6b5f2753d06ddc1774ffa01d3cae9"
Apr 22 15:34:15.374462 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.374435 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="24bbb7447f235cb70ffb27f0078909fa9517a181ac9be4ba8e920a0e1c8a864b" exitCode=0
Apr 22 15:34:15.374588 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.374528 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"24bbb7447f235cb70ffb27f0078909fa9517a181ac9be4ba8e920a0e1c8a864b"}
Apr 22 15:34:15.377359 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.377330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" event={"ID":"fadfec8b-d979-4153-81da-c1de52954dd2","Type":"ContainerStarted","Data":"96b9a3bf9613e59c1001af0a6aaa3307779e98c35ddfedb639f277efce7af201"}
Apr 22 15:34:15.390549 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.390526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:34:15.390635 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.390596 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:34:15.438223 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:15.438154 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pmfn9" podStartSLOduration=3.709270642 podStartE2EDuration="26.438135057s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.953995328 +0000 UTC m=+3.279748961" lastFinishedPulling="2026-04-22 15:34:14.682859744 +0000 UTC m=+26.008613376" observedRunningTime="2026-04-22 15:34:15.437754304 +0000 UTC m=+26.763507956" watchObservedRunningTime="2026-04-22 15:34:15.438135057 +0000 UTC m=+26.763888707"
Apr 22 15:34:16.383737 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.383709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log"
Apr 22 15:34:16.384256 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.384223 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" event={"ID":"e237e451-58c6-4255-bef9-a4ac5f2d06c7","Type":"ContainerStarted","Data":"40c4ee5322c836751cc0e3a731ec3c3ed617126249c9e62e3ac065ae31b34668"}
Apr 22 15:34:16.384339 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.384311 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:34:16.413439 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.413354 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" podStartSLOduration=9.822359711 podStartE2EDuration="27.413335694s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.959758659 +0000 UTC m=+3.285512289" lastFinishedPulling="2026-04-22 15:34:09.550734629 +0000 UTC m=+20.876488272" observedRunningTime="2026-04-22 15:34:16.411806201 +0000 UTC m=+27.737559858" watchObservedRunningTime="2026-04-22 15:34:16.413335694 +0000 UTC m=+27.739089363"
Apr 22 15:34:16.743778 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.743743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hgz25"]
Apr 22 15:34:16.743923 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.743901 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25"
Apr 22 15:34:16.744049 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:16.744019 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1"
Apr 22 15:34:16.750731 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.750697 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vk5nl"]
Apr 22 15:34:16.750871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.750824 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:16.750927 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:16.750910 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:34:16.756145 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.756108 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jgsl7"]
Apr 22 15:34:16.756287 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:16.756255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:34:16.756384 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:16.756362 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:17.390724 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:17.390686 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="538d74c2a21a2c841daaf10a958a4e935c4e5ae626f6f8a3ed971b42a6cb3e1c" exitCode=0
Apr 22 15:34:17.391317 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:17.390771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"538d74c2a21a2c841daaf10a958a4e935c4e5ae626f6f8a3ed971b42a6cb3e1c"}
Apr 22 15:34:17.391317 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:17.390994 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:34:18.249777 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:18.249745 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:34:18.249947 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:18.249787 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25"
Apr 22 15:34:18.249947 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:18.249868 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:18.249947 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:18.249931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:18.250116 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:18.250098 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:34:18.250176 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:18.250149 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1"
Apr 22 15:34:18.394377 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:18.394291 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="1d0c8c3855306b5dc334f16acb8eef8c375cebecb3d8e2cf1ab6ffcce981c55b" exitCode=0
Apr 22 15:34:18.394377 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:18.394335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"1d0c8c3855306b5dc334f16acb8eef8c375cebecb3d8e2cf1ab6ffcce981c55b"}
Apr 22 15:34:20.250181 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.250139 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:34:20.250181 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.250166 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25"
Apr 22 15:34:20.250606 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:20.250248 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:20.250606 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.250294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:20.250606 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:20.250361 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:34:20.250606 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:20.250457 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1"
Apr 22 15:34:20.444895 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.444857 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w"
Apr 22 15:34:20.445142 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.445123 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:34:20.457192 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.456943 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" podUID="e237e451-58c6-4255-bef9-a4ac5f2d06c7" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 15:34:20.468729 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:20.468678 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" podUID="e237e451-58c6-4255-bef9-a4ac5f2d06c7" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 15:34:22.249839 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.249802 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25"
Apr 22 15:34:22.250445 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.249864 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:22.250445 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.249980 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7"
Apr 22 15:34:22.250445 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.250001 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hgz25" podUID="d6e0d117-87ac-43fe-bf80-ea2add6000f1"
Apr 22 15:34:22.250445 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.250091 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgsl7" podUID="07586edf-24f7-4873-81ac-df167bc41e5e"
Apr 22 15:34:22.250445 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.250196 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:34:22.457038 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.457003 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeReady"
Apr 22 15:34:22.457237 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.457200 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 15:34:22.502242 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.499985 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"]
Apr 22 15:34:22.504688 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.504668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.507708 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.507682 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 15:34:22.507903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.507736 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-92frt\""
Apr 22 15:34:22.508007 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.507994 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 15:34:22.508098 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.508037 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 15:34:22.514332 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.514308 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 15:34:22.516120 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.516095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"]
Apr 22 15:34:22.578678 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.578863 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.578863 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.578953 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6njk\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.578953 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.578953 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.579125 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.578962 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.579125 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.579049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679717 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6njk\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.679903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.680188 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.680188 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.679947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.680354 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.680316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.680500 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.680428 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:34:22.680500 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.680445 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found
Apr 22 15:34:22.680500 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.680478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.680500 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.680501 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.180480828 +0000 UTC m=+34.506234456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found
Apr 22 15:34:22.681322 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.681298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.684761 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.684727 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.685082 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.685045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.698471 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.698441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.698629 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.698451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6njk\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:34:22.762005 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.761910 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-x6h97"]
Apr 22 15:34:22.764316 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.764285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x6h97"
Apr 22 15:34:22.766875 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.766841 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 15:34:22.767030 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.766847 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 15:34:22.767030 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.766893 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\""
Apr 22 15:34:22.767030 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.766904 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 15:34:22.775570 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.775187 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x6h97"]
Apr 22 15:34:22.782758 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.782714 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6w5vh"]
Apr 22 15:34:22.784983 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.784954 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.787731 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.787670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 15:34:22.788283 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.788260 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\""
Apr 22 15:34:22.788578 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.788562 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 15:34:22.799595 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.799564 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w5vh"]
Apr 22 15:34:22.882079 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcb9971-bb8e-460b-b9e5-409f39381abb-config-volume\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.882278 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9971-bb8e-460b-b9e5-409f39381abb-tmp-dir\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.882278 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882173 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxwrt\" (UniqueName: \"kubernetes.io/projected/2bcb9971-bb8e-460b-b9e5-409f39381abb-kube-api-access-lxwrt\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.882278 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssmrx\" (UniqueName: \"kubernetes.io/projected/4078b99e-a844-47ea-8e8d-88fefc3efd1b-kube-api-access-ssmrx\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97"
Apr 22 15:34:22.882444 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl"
Apr 22 15:34:22.882444 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97"
Apr 22 15:34:22.882444 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.882338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.882584 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.882447 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:22.882584 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.882562 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:34:54.882537673 +0000 UTC m=+66.208291316 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:22.983027 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.982979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxwrt\" (UniqueName: \"kubernetes.io/projected/2bcb9971-bb8e-460b-b9e5-409f39381abb-kube-api-access-lxwrt\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:34:22.983241 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssmrx\" (UniqueName: \"kubernetes.io/projected/4078b99e-a844-47ea-8e8d-88fefc3efd1b-kube-api-access-ssmrx\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97"
Apr 22 15:34:22.983241 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod 
\"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:22.983241 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:22.983241 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983229 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983228 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.483272898 +0000 UTC m=+34.809026528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983307 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983322 2573 projected.go:194] Error preparing data for projected volume kube-api-access-8hf9p for pod openshift-network-diagnostics/network-check-target-jgsl7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983341 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcb9971-bb8e-460b-b9e5-409f39381abb-config-volume\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983381 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p podName:07586edf-24f7-4873-81ac-df167bc41e5e nodeName:}" failed. No retries permitted until 2026-04-22 15:34:54.983365324 +0000 UTC m=+66.309118958 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hf9p" (UniqueName: "kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p") pod "network-check-target-jgsl7" (UID: "07586edf-24f7-4873-81ac-df167bc41e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:22.983409 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.483394989 +0000 UTC m=+34.809148619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:22.983458 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9971-bb8e-460b-b9e5-409f39381abb-tmp-dir\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.984004 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9971-bb8e-460b-b9e5-409f39381abb-tmp-dir\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.984004 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.983849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2bcb9971-bb8e-460b-b9e5-409f39381abb-config-volume\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.992532 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.992505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxwrt\" (UniqueName: \"kubernetes.io/projected/2bcb9971-bb8e-460b-b9e5-409f39381abb-kube-api-access-lxwrt\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:22.992719 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:22.992639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssmrx\" (UniqueName: \"kubernetes.io/projected/4078b99e-a844-47ea-8e8d-88fefc3efd1b-kube-api-access-ssmrx\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:23.084558 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:23.084458 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:23.084738 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.084640 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:23.084738 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.084723 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret podName:d6e0d117-87ac-43fe-bf80-ea2add6000f1 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:55.084704149 +0000 UTC m=+66.410457790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret") pod "global-pull-secret-syncer-hgz25" (UID: "d6e0d117-87ac-43fe-bf80-ea2add6000f1") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:23.185955 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:23.185898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:23.186180 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.186100 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:23.186180 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.186125 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:34:23.186307 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.186202 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.18617284 +0000 UTC m=+35.511926479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:23.488663 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:23.488614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:23.489150 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:23.488681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:23.489150 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.488784 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:23.489150 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.488861 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.488842968 +0000 UTC m=+35.814596602 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:23.489150 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.488861 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:23.489150 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:23.488936 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.488920134 +0000 UTC m=+35.814673768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:24.193862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.193822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:24.194095 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.193995 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:24.194095 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.194020 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 
15:34:24.194232 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.194119 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:26.194096356 +0000 UTC m=+37.519849998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:24.250152 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.250113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:24.250152 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.250133 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:24.250489 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.250467 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:24.253993 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.253961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:34:24.253993 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.253980 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:34:24.254246 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.253986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\"" Apr 22 15:34:24.254246 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.254153 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xnsfv\"" Apr 22 15:34:24.254246 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.254152 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:34:24.254389 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.254320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:34:24.496296 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.496252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:24.496721 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:24.496333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:24.496721 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.496391 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:24.496721 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.496477 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:34:26.496455501 +0000 UTC m=+37.822209133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:24.496721 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.496503 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:24.496721 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:24.496576 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:26.496562769 +0000 UTC m=+37.822316398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:26.209881 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:26.209837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:26.210344 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.209999 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:26.210344 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.210019 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:34:26.210344 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.210109 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:30.210092462 +0000 UTC m=+41.535846104 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:26.414610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:26.414571 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="d04018f066dc6d495790969ea07a7fb28fe0a1b67e4e2d69ad038f9c56c802cf" exitCode=0 Apr 22 15:34:26.414799 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:26.414632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"d04018f066dc6d495790969ea07a7fb28fe0a1b67e4e2d69ad038f9c56c802cf"} Apr 22 15:34:26.512321 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:26.512279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:26.512499 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:26.512344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:26.512499 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.512433 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:26.512499 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.512461 2573 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:26.512617 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.512509 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:34:30.512487255 +0000 UTC m=+41.838240888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:26.512617 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:26.512528 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:30.512518916 +0000 UTC m=+41.838272545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:27.418878 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:27.418832 2573 generic.go:358] "Generic (PLEG): container finished" podID="37d4dbd3-61f0-47a0-bd23-69d3cd755850" containerID="1652d323511823e93bca60dc1bb2f6dd8d0423835ff287de7898372be90c7f26" exitCode=0 Apr 22 15:34:27.419299 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:27.418898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerDied","Data":"1652d323511823e93bca60dc1bb2f6dd8d0423835ff287de7898372be90c7f26"} Apr 22 15:34:28.424433 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:28.424237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" event={"ID":"37d4dbd3-61f0-47a0-bd23-69d3cd755850","Type":"ContainerStarted","Data":"77b9ae5ca33180cce9278aafbb69f574bf7627ac30c0938f417374da24264541"} Apr 22 15:34:28.454164 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:28.454105 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zr4rf" podStartSLOduration=6.033807986 podStartE2EDuration="39.454088886s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:33:51.95854649 +0000 UTC m=+3.284300119" lastFinishedPulling="2026-04-22 15:34:25.378827387 +0000 UTC m=+36.704581019" observedRunningTime="2026-04-22 15:34:28.453587109 +0000 UTC m=+39.779340761" watchObservedRunningTime="2026-04-22 15:34:28.454088886 +0000 UTC m=+39.779842534" Apr 22 15:34:30.241448 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:30.241397 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:30.241944 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.241554 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:30.241944 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.241574 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:34:30.241944 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.241630 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:38.241615741 +0000 UTC m=+49.567369371 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:30.543797 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:30.543702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:30.543797 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:30.543759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:30.543978 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.543857 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:30.543978 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.543888 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:30.543978 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.543930 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:38.543917238 +0000 UTC m=+49.869670867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:30.543978 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:30.543941 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:34:38.543936071 +0000 UTC m=+49.869689699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:38.302303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:38.302262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:38.302707 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.302410 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:38.302707 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.302424 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:34:38.302707 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.302491 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:54.302474277 +0000 UTC m=+65.628227909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:38.605129 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:38.605015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:38.605129 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:38.605083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:38.605329 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.605183 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:38.605329 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.605185 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:38.605329 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.605257 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:54.605235972 +0000 UTC m=+65.930989615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:38.605329 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:38.605275 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:34:54.605267233 +0000 UTC m=+65.931020862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:50.468752 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:50.468679 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwt7w" Apr 22 15:34:54.327392 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:54.327345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:34:54.327858 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.327475 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:34:54.327858 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.327486 2573 projected.go:194] Error preparing data for projected volume 
registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:34:54.327858 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.327539 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.327524793 +0000 UTC m=+97.653278422 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:34:54.628876 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:54.628779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:34:54.628876 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:54.628842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:34:54.629084 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.628931 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:54.629084 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.628938 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:54.629084 ip-10-0-143-128 
kubenswrapper[2573]: E0422 15:34:54.628997 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.628981353 +0000 UTC m=+97.954734982 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:34:54.629084 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.629010 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.629004324 +0000 UTC m=+97.954757952 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:34:54.930724 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:54.930627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:34:54.933289 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:54.933264 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:34:54.941679 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.941650 2573 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:34:54.941748 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:34:54.941737 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.941720049 +0000 UTC m=+130.267473681 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : secret "metrics-daemon-secret" not found Apr 22 15:34:55.031578 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.031535 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:55.034509 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.034482 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:34:55.044538 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.044515 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:34:55.056370 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.056334 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hf9p\" (UniqueName: \"kubernetes.io/projected/07586edf-24f7-4873-81ac-df167bc41e5e-kube-api-access-8hf9p\") pod \"network-check-target-jgsl7\" (UID: \"07586edf-24f7-4873-81ac-df167bc41e5e\") " 
pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:55.132392 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.132350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:55.135341 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.135317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:34:55.145958 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.145928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d6e0d117-87ac-43fe-bf80-ea2add6000f1-original-pull-secret\") pod \"global-pull-secret-syncer-hgz25\" (UID: \"d6e0d117-87ac-43fe-bf80-ea2add6000f1\") " pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:55.165186 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.165151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xnsfv\"" Apr 22 15:34:55.170287 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.170265 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hgz25" Apr 22 15:34:55.174017 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.173994 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:55.369219 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.369188 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hgz25"] Apr 22 15:34:55.373146 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:34:55.373056 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e0d117_87ac_43fe_bf80_ea2add6000f1.slice/crio-9892a864007b3fd15da1c5ecdccecc2185b79ea7cebf65685d7648638eb22e95 WatchSource:0}: Error finding container 9892a864007b3fd15da1c5ecdccecc2185b79ea7cebf65685d7648638eb22e95: Status 404 returned error can't find the container with id 9892a864007b3fd15da1c5ecdccecc2185b79ea7cebf65685d7648638eb22e95 Apr 22 15:34:55.387189 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.387155 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jgsl7"] Apr 22 15:34:55.391018 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:34:55.390975 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07586edf_24f7_4873_81ac_df167bc41e5e.slice/crio-acac0b83d45078a331351095665d3a9b2fdb6e9849186d4b35f1198bb8612c4e WatchSource:0}: Error finding container acac0b83d45078a331351095665d3a9b2fdb6e9849186d4b35f1198bb8612c4e: Status 404 returned error can't find the container with id acac0b83d45078a331351095665d3a9b2fdb6e9849186d4b35f1198bb8612c4e Apr 22 15:34:55.479662 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:55.479566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jgsl7" event={"ID":"07586edf-24f7-4873-81ac-df167bc41e5e","Type":"ContainerStarted","Data":"acac0b83d45078a331351095665d3a9b2fdb6e9849186d4b35f1198bb8612c4e"} Apr 22 15:34:55.480527 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:34:55.480506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hgz25" event={"ID":"d6e0d117-87ac-43fe-bf80-ea2add6000f1","Type":"ContainerStarted","Data":"9892a864007b3fd15da1c5ecdccecc2185b79ea7cebf65685d7648638eb22e95"} Apr 22 15:34:58.489391 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:58.489350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jgsl7" event={"ID":"07586edf-24f7-4873-81ac-df167bc41e5e","Type":"ContainerStarted","Data":"8cb6c6dff59f2f2b4e65f93e27e7eec07f0592fc321f11b10caf825e6cdd4d65"} Apr 22 15:34:58.489839 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:58.489571 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:34:58.505264 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:34:58.505204 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jgsl7" podStartSLOduration=66.518765615 podStartE2EDuration="1m9.505180226s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:34:55.392906102 +0000 UTC m=+66.718659735" lastFinishedPulling="2026-04-22 15:34:58.379320713 +0000 UTC m=+69.705074346" observedRunningTime="2026-04-22 15:34:58.504449936 +0000 UTC m=+69.830203589" watchObservedRunningTime="2026-04-22 15:34:58.505180226 +0000 UTC m=+69.830933877" Apr 22 15:35:01.496467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:01.496426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hgz25" event={"ID":"d6e0d117-87ac-43fe-bf80-ea2add6000f1","Type":"ContainerStarted","Data":"353cb5e58198a834445f2365e7afbd5020c95edd3ce22f5454901208d0533a02"} Apr 22 15:35:01.510969 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:01.510915 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hgz25" podStartSLOduration=64.909590773 podStartE2EDuration="1m10.510900256s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:34:55.374944987 +0000 UTC m=+66.700698619" lastFinishedPulling="2026-04-22 15:35:00.976254471 +0000 UTC m=+72.302008102" observedRunningTime="2026-04-22 15:35:01.510774227 +0000 UTC m=+72.836527878" watchObservedRunningTime="2026-04-22 15:35:01.510900256 +0000 UTC m=+72.836653900" Apr 22 15:35:26.364245 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:26.364206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:35:26.364618 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.364359 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:35:26.364618 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.364382 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57f57db9dc-2l5nd: secret "image-registry-tls" not found Apr 22 15:35:26.364618 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.364453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls podName:1aa0caed-5656-4ed9-8cc0-35f66d7d8123 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:30.36443638 +0000 UTC m=+161.690190008 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls") pod "image-registry-57f57db9dc-2l5nd" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123") : secret "image-registry-tls" not found Apr 22 15:35:26.666569 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:26.666481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:35:26.666569 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:26.666527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:35:26.666751 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.666637 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:35:26.666751 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.666702 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls podName:2bcb9971-bb8e-460b-b9e5-409f39381abb nodeName:}" failed. No retries permitted until 2026-04-22 15:36:30.666687404 +0000 UTC m=+161.992441032 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls") pod "dns-default-6w5vh" (UID: "2bcb9971-bb8e-460b-b9e5-409f39381abb") : secret "dns-default-metrics-tls" not found Apr 22 15:35:26.666751 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.666637 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:35:26.666847 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:26.666773 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert podName:4078b99e-a844-47ea-8e8d-88fefc3efd1b nodeName:}" failed. No retries permitted until 2026-04-22 15:36:30.666758966 +0000 UTC m=+161.992512598 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert") pod "ingress-canary-x6h97" (UID: "4078b99e-a844-47ea-8e8d-88fefc3efd1b") : secret "canary-serving-cert" not found Apr 22 15:35:29.494400 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:29.494369 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jgsl7" Apr 22 15:35:58.617934 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.617896 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"] Apr 22 15:35:58.619937 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.619918 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:58.622543 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.622516 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 15:35:58.622683 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.622545 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:35:58.622683 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.622609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-62cmj\"" Apr 22 15:35:58.623554 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.623533 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 15:35:58.631873 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.631841 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"] Apr 22 15:35:58.698868 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.698825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:58.800172 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.800130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:58.800172 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.800180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjjfr\" (UniqueName: \"kubernetes.io/projected/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-kube-api-access-mjjfr\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:58.800410 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:58.800286 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:35:58.800410 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:58.800365 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:35:59.300347717 +0000 UTC m=+130.626101349 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found Apr 22 15:35:58.900822 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.900722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjjfr\" (UniqueName: \"kubernetes.io/projected/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-kube-api-access-mjjfr\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:58.909728 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:58.909697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjjfr\" (UniqueName: \"kubernetes.io/projected/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-kube-api-access-mjjfr\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:59.001386 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:59.001340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:35:59.001536 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:59.001487 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:35:59.001580 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:59.001562 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs podName:0708298d-9f47-4968-9489-c7cb22cb282c nodeName:}" failed. No retries permitted until 2026-04-22 15:38:01.001545786 +0000 UTC m=+252.327299414 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs") pod "network-metrics-daemon-vk5nl" (UID: "0708298d-9f47-4968-9489-c7cb22cb282c") : secret "metrics-daemon-secret" not found Apr 22 15:35:59.302927 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:35:59.302876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:35:59.303137 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:59.303022 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:35:59.303137 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:35:59.303106 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:36:00.303091393 +0000 UTC m=+131.628845022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found Apr 22 15:36:00.310508 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.310462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:36:00.310900 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:00.310613 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:36:00.310900 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:00.310684 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:36:02.310666743 +0000 UTC m=+133.636420376 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found
Apr 22 15:36:00.613996 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.613901 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-44bxm"]
Apr 22 15:36:00.615724 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.615705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.618577 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.618546 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 15:36:00.618742 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.618630 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 15:36:00.618742 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.618655 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-8bvx7\""
Apr 22 15:36:00.619720 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.619703 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:36:00.619720 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.619702 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 15:36:00.624218 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.624194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 15:36:00.626516 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.626491 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-44bxm"]
Apr 22 15:36:00.712836 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.712800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-config\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.713028 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.712840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-trusted-ca\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.713028 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.712910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgt2q\" (UniqueName: \"kubernetes.io/projected/11386c68-c09f-4923-91a0-bfd58155fe9e-kube-api-access-qgt2q\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.713028 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.712957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11386c68-c09f-4923-91a0-bfd58155fe9e-serving-cert\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814099 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgt2q\" (UniqueName: \"kubernetes.io/projected/11386c68-c09f-4923-91a0-bfd58155fe9e-kube-api-access-qgt2q\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814224 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11386c68-c09f-4923-91a0-bfd58155fe9e-serving-cert\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814224 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-config\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814224 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-trusted-ca\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814776 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-config\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.814988 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.814898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11386c68-c09f-4923-91a0-bfd58155fe9e-trusted-ca\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.816640 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.816618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11386c68-c09f-4923-91a0-bfd58155fe9e-serving-cert\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.828052 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.828027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgt2q\" (UniqueName: \"kubernetes.io/projected/11386c68-c09f-4923-91a0-bfd58155fe9e-kube-api-access-qgt2q\") pod \"console-operator-9d4b6777b-44bxm\" (UID: \"11386c68-c09f-4923-91a0-bfd58155fe9e\") " pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:00.926093 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:00.925984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:01.026420 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:01.026395 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-64nrn_f9cf0d97-f5d8-44fe-a781-fa3940c08f48/dns-node-resolver/0.log"
Apr 22 15:36:01.048565 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:01.048532 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-44bxm"]
Apr 22 15:36:01.052360 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:01.052319 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11386c68_c09f_4923_91a0_bfd58155fe9e.slice/crio-fde099dfe1db7d307664ea931221568982e8c1127df542c208a495dde0b71468 WatchSource:0}: Error finding container fde099dfe1db7d307664ea931221568982e8c1127df542c208a495dde0b71468: Status 404 returned error can't find the container with id fde099dfe1db7d307664ea931221568982e8c1127df542c208a495dde0b71468
Apr 22 15:36:01.616173 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:01.616130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" event={"ID":"11386c68-c09f-4923-91a0-bfd58155fe9e","Type":"ContainerStarted","Data":"fde099dfe1db7d307664ea931221568982e8c1127df542c208a495dde0b71468"}
Apr 22 15:36:01.826099 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:01.826055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8sqml_2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69/node-ca/0.log"
Apr 22 15:36:02.325201 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:02.325151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"
Apr 22 15:36:02.325408 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:02.325372 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:36:02.325471 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:02.325453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:36:06.32543161 +0000 UTC m=+137.651185243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found
Apr 22 15:36:03.622247 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:03.622211 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/0.log"
Apr 22 15:36:03.622610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:03.622265 2573 generic.go:358] "Generic (PLEG): container finished" podID="11386c68-c09f-4923-91a0-bfd58155fe9e" containerID="4781e9fcd6c50d5049d047bd177e8b2fa64226516360796f9ad148c6d9346873" exitCode=255
Apr 22 15:36:03.622610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:03.622327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" event={"ID":"11386c68-c09f-4923-91a0-bfd58155fe9e","Type":"ContainerDied","Data":"4781e9fcd6c50d5049d047bd177e8b2fa64226516360796f9ad148c6d9346873"}
Apr 22 15:36:03.622610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:03.622539 2573 scope.go:117] "RemoveContainer" containerID="4781e9fcd6c50d5049d047bd177e8b2fa64226516360796f9ad148c6d9346873"
Apr 22 15:36:04.625744 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.625711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log"
Apr 22 15:36:04.626281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.626052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/0.log"
Apr 22 15:36:04.626281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.626106 2573 generic.go:358] "Generic (PLEG): container finished" podID="11386c68-c09f-4923-91a0-bfd58155fe9e" containerID="05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4" exitCode=255
Apr 22 15:36:04.626281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.626133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" event={"ID":"11386c68-c09f-4923-91a0-bfd58155fe9e","Type":"ContainerDied","Data":"05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4"}
Apr 22 15:36:04.626281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.626174 2573 scope.go:117] "RemoveContainer" containerID="4781e9fcd6c50d5049d047bd177e8b2fa64226516360796f9ad148c6d9346873"
Apr 22 15:36:04.626554 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:04.626483 2573 scope.go:117] "RemoveContainer" containerID="05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4"
Apr 22 15:36:04.626695 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:04.626676 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-44bxm_openshift-console-operator(11386c68-c09f-4923-91a0-bfd58155fe9e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" podUID="11386c68-c09f-4923-91a0-bfd58155fe9e"
Apr 22 15:36:05.629257 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:05.629231 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log"
Apr 22 15:36:05.629634 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:05.629552 2573 scope.go:117] "RemoveContainer" containerID="05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4"
Apr 22 15:36:05.629720 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:05.629703 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-44bxm_openshift-console-operator(11386c68-c09f-4923-91a0-bfd58155fe9e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" podUID="11386c68-c09f-4923-91a0-bfd58155fe9e"
Apr 22 15:36:06.028475 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.028443 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"]
Apr 22 15:36:06.030677 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.030657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"
Apr 22 15:36:06.034095 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.034058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-k592z\""
Apr 22 15:36:06.050694 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.050661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"]
Apr 22 15:36:06.053943 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.053916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcj9\" (UniqueName: \"kubernetes.io/projected/ee532035-be0c-4e7e-a7b9-da60be33b91c-kube-api-access-zrcj9\") pod \"network-check-source-8894fc9bd-kgskm\" (UID: \"ee532035-be0c-4e7e-a7b9-da60be33b91c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"
Apr 22 15:36:06.154986 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.154943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcj9\" (UniqueName: \"kubernetes.io/projected/ee532035-be0c-4e7e-a7b9-da60be33b91c-kube-api-access-zrcj9\") pod \"network-check-source-8894fc9bd-kgskm\" (UID: \"ee532035-be0c-4e7e-a7b9-da60be33b91c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"
Apr 22 15:36:06.166119 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.166085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcj9\" (UniqueName: \"kubernetes.io/projected/ee532035-be0c-4e7e-a7b9-da60be33b91c-kube-api-access-zrcj9\") pod \"network-check-source-8894fc9bd-kgskm\" (UID: \"ee532035-be0c-4e7e-a7b9-da60be33b91c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"
Apr 22 15:36:06.339365 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.339271 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"
Apr 22 15:36:06.356434 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.356393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"
Apr 22 15:36:06.356613 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:06.356588 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:36:06.356692 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:06.356680 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:36:14.356657602 +0000 UTC m=+145.682411254 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found
Apr 22 15:36:06.466381 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.466343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm"]
Apr 22 15:36:06.469698 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:06.469655 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee532035_be0c_4e7e_a7b9_da60be33b91c.slice/crio-bb5f78c3a80b0d730f88fcd9e6ee1b00c5a430da41d27a3bfb8e2481ad89afc0 WatchSource:0}: Error finding container bb5f78c3a80b0d730f88fcd9e6ee1b00c5a430da41d27a3bfb8e2481ad89afc0: Status 404 returned error can't find the container with id bb5f78c3a80b0d730f88fcd9e6ee1b00c5a430da41d27a3bfb8e2481ad89afc0
Apr 22 15:36:06.633048 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.632950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm" event={"ID":"ee532035-be0c-4e7e-a7b9-da60be33b91c","Type":"ContainerStarted","Data":"d93934b9121b4909a2a35886341f2928a8be785663a6a64b41eefc86a16f118c"}
Apr 22 15:36:06.633048 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.632991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm" event={"ID":"ee532035-be0c-4e7e-a7b9-da60be33b91c","Type":"ContainerStarted","Data":"bb5f78c3a80b0d730f88fcd9e6ee1b00c5a430da41d27a3bfb8e2481ad89afc0"}
Apr 22 15:36:06.653999 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:06.653939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgskm" podStartSLOduration=1.653919444 podStartE2EDuration="1.653919444s" podCreationTimestamp="2026-04-22 15:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:06.653173684 +0000 UTC m=+137.978927339" watchObservedRunningTime="2026-04-22 15:36:06.653919444 +0000 UTC m=+137.979673096"
Apr 22 15:36:10.926843 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:10.926789 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:10.926843 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:10.926846 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:10.927429 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:10.927315 2573 scope.go:117] "RemoveContainer" containerID="05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4"
Apr 22 15:36:10.927540 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:10.927516 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-44bxm_openshift-console-operator(11386c68-c09f-4923-91a0-bfd58155fe9e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" podUID="11386c68-c09f-4923-91a0-bfd58155fe9e"
Apr 22 15:36:11.714537 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.714498 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hd9lf"]
Apr 22 15:36:11.716422 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.716403 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.722236 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.719213 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 15:36:11.722678 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.722656 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tbsd9\""
Apr 22 15:36:11.722779 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.722657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 15:36:11.722779 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.722656 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 15:36:11.722871 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.722706 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 15:36:11.726806 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.726554 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hd9lf"]
Apr 22 15:36:11.796754 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.796710 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d688f2b7-158c-4398-9277-b2535423a024-signing-cabundle\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.796931 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.796803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf227\" (UniqueName: \"kubernetes.io/projected/d688f2b7-158c-4398-9277-b2535423a024-kube-api-access-wf227\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.796931 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.796852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d688f2b7-158c-4398-9277-b2535423a024-signing-key\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.897975 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.897937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf227\" (UniqueName: \"kubernetes.io/projected/d688f2b7-158c-4398-9277-b2535423a024-kube-api-access-wf227\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.898115 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.897999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d688f2b7-158c-4398-9277-b2535423a024-signing-key\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.898115 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.898024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d688f2b7-158c-4398-9277-b2535423a024-signing-cabundle\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.898672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.898651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d688f2b7-158c-4398-9277-b2535423a024-signing-cabundle\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.900642 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.900619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d688f2b7-158c-4398-9277-b2535423a024-signing-key\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:11.907174 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:11.907150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf227\" (UniqueName: \"kubernetes.io/projected/d688f2b7-158c-4398-9277-b2535423a024-kube-api-access-wf227\") pod \"service-ca-865cb79987-hd9lf\" (UID: \"d688f2b7-158c-4398-9277-b2535423a024\") " pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:12.029836 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:12.029788 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hd9lf"
Apr 22 15:36:12.155016 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:12.154954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hd9lf"]
Apr 22 15:36:12.158922 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:12.158889 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd688f2b7_158c_4398_9277_b2535423a024.slice/crio-c39ac2d8d69bee32c097a3ca02342ab31b71cab509ce1bc2764c94f7de72a498 WatchSource:0}: Error finding container c39ac2d8d69bee32c097a3ca02342ab31b71cab509ce1bc2764c94f7de72a498: Status 404 returned error can't find the container with id c39ac2d8d69bee32c097a3ca02342ab31b71cab509ce1bc2764c94f7de72a498
Apr 22 15:36:12.646782 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:12.646745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hd9lf" event={"ID":"d688f2b7-158c-4398-9277-b2535423a024","Type":"ContainerStarted","Data":"c39ac2d8d69bee32c097a3ca02342ab31b71cab509ce1bc2764c94f7de72a498"}
Apr 22 15:36:14.416952 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:14.416856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"
Apr 22 15:36:14.417444 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:14.416995 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:36:14.417444 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:14.417056 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls podName:33db7d39-27dc-47a6-83dc-91f5dff0fb7c nodeName:}" failed. No retries permitted until 2026-04-22 15:36:30.417041697 +0000 UTC m=+161.742795325 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qxgng" (UID: "33db7d39-27dc-47a6-83dc-91f5dff0fb7c") : secret "samples-operator-tls" not found
Apr 22 15:36:14.653435 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:14.653399 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hd9lf" event={"ID":"d688f2b7-158c-4398-9277-b2535423a024","Type":"ContainerStarted","Data":"b88134120c5d4532463977cc0a85af6a67b6556a2d7140d19961ff62e6bf681d"}
Apr 22 15:36:14.672807 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:14.672687 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-hd9lf" podStartSLOduration=1.7551660359999999 podStartE2EDuration="3.672655086s" podCreationTimestamp="2026-04-22 15:36:11 +0000 UTC" firstStartedPulling="2026-04-22 15:36:12.160733981 +0000 UTC m=+143.486487613" lastFinishedPulling="2026-04-22 15:36:14.07822303 +0000 UTC m=+145.403976663" observedRunningTime="2026-04-22 15:36:14.670897318 +0000 UTC m=+145.996650968" watchObservedRunningTime="2026-04-22 15:36:14.672655086 +0000 UTC m=+145.998408736"
Apr 22 15:36:23.250168 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:23.250132 2573 scope.go:117] "RemoveContainer" containerID="05c4e62870952d0ff4b1fe3c55b077f17a73583e26296f84d438f38ff4b45ce4"
Apr 22 15:36:23.677298 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:23.677215 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log"
Apr 22 15:36:23.677436 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:23.677302 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" event={"ID":"11386c68-c09f-4923-91a0-bfd58155fe9e","Type":"ContainerStarted","Data":"d0d24cc0a7973979fb69437081a2c0d3d9cba0eeba26d7f2af9863b565e8f138"}
Apr 22 15:36:23.677646 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:23.677629 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:23.697670 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:23.697604 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm" podStartSLOduration=21.308120057 podStartE2EDuration="23.697587386s" podCreationTimestamp="2026-04-22 15:36:00 +0000 UTC" firstStartedPulling="2026-04-22 15:36:01.054269535 +0000 UTC m=+132.380023171" lastFinishedPulling="2026-04-22 15:36:03.443736868 +0000 UTC m=+134.769490500" observedRunningTime="2026-04-22 15:36:23.69553007 +0000 UTC m=+155.021283720" watchObservedRunningTime="2026-04-22 15:36:23.697587386 +0000 UTC m=+155.023341036"
Apr 22 15:36:24.278487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:24.278458 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-44bxm"
Apr 22 15:36:25.513992 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:25.513950 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123"
Apr 22 15:36:25.683035 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:25.683002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:36:25.776364 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:25.776265 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-x6h97" podUID="4078b99e-a844-47ea-8e8d-88fefc3efd1b"
Apr 22 15:36:25.796914 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:25.796869 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6w5vh" podUID="2bcb9971-bb8e-460b-b9e5-409f39381abb"
Apr 22 15:36:26.687014 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:26.686915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x6h97"
Apr 22 15:36:26.687438 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:26.687096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w5vh"
Apr 22 15:36:27.276428 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:27.276375 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vk5nl" podUID="0708298d-9f47-4968-9489-c7cb22cb282c"
Apr 22 15:36:30.451203 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.451160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:36:30.451602 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.451261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"
Apr 22 15:36:30.453824 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.453801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"image-registry-57f57db9dc-2l5nd\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:36:30.453912 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.453891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33db7d39-27dc-47a6-83dc-91f5dff0fb7c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qxgng\" (UID: \"33db7d39-27dc-47a6-83dc-91f5dff0fb7c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"
Apr 22 15:36:30.486576 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.486543 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-92frt\""
Apr 22 15:36:30.494817 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.494787 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:36:30.629044 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.629012 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"]
Apr 22 15:36:30.632230 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:30.632189 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa0caed_5656_4ed9_8cc0_35f66d7d8123.slice/crio-74250c3747988c6d1e1abda6658f12c76da0a9102cbd166bc71ca0f9bd27ae1f WatchSource:0}: Error finding container 74250c3747988c6d1e1abda6658f12c76da0a9102cbd166bc71ca0f9bd27ae1f: Status 404 returned error can't find the container with id 74250c3747988c6d1e1abda6658f12c76da0a9102cbd166bc71ca0f9bd27ae1f
Apr 22 15:36:30.699428 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.699389 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" event={"ID":"1aa0caed-5656-4ed9-8cc0-35f66d7d8123","Type":"ContainerStarted","Data":"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb"}
Apr 22 15:36:30.699428 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.699424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" event={"ID":"1aa0caed-5656-4ed9-8cc0-35f66d7d8123","Type":"ContainerStarted","Data":"74250c3747988c6d1e1abda6658f12c76da0a9102cbd166bc71ca0f9bd27ae1f"}
Apr 22 15:36:30.699680 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.699535 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd"
Apr 22 15:36:30.726281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.726159 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" podStartSLOduration=161.726141629 podStartE2EDuration="2m41.726141629s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:30.724924261 +0000 UTC m=+162.050677939" watchObservedRunningTime="2026-04-22 15:36:30.726141629 +0000 UTC m=+162.051895280"
Apr 22 15:36:30.729272 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.729246 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" Apr 22 15:36:30.754310 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.754267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:36:30.754503 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.754342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:36:30.757114 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.757050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bcb9971-bb8e-460b-b9e5-409f39381abb-metrics-tls\") pod \"dns-default-6w5vh\" (UID: \"2bcb9971-bb8e-460b-b9e5-409f39381abb\") " pod="openshift-dns/dns-default-6w5vh" Apr 22 15:36:30.757514 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.757496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4078b99e-a844-47ea-8e8d-88fefc3efd1b-cert\") pod \"ingress-canary-x6h97\" (UID: \"4078b99e-a844-47ea-8e8d-88fefc3efd1b\") " pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:36:30.858692 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.858647 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng"] Apr 22 15:36:30.890541 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.890512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\"" Apr 22 15:36:30.890541 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.890512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\"" Apr 22 15:36:30.898142 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.898118 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x6h97" Apr 22 15:36:30.898249 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:30.898141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w5vh" Apr 22 15:36:31.052090 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:31.052039 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w5vh"] Apr 22 15:36:31.055444 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:31.055414 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcb9971_bb8e_460b_b9e5_409f39381abb.slice/crio-75615079d8ccb78c79670f4517bbe828562ff676080ffddec89e1a29e6ebf724 WatchSource:0}: Error finding container 75615079d8ccb78c79670f4517bbe828562ff676080ffddec89e1a29e6ebf724: Status 404 returned error can't find the container with id 75615079d8ccb78c79670f4517bbe828562ff676080ffddec89e1a29e6ebf724 Apr 22 15:36:31.064677 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:31.064646 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x6h97"] Apr 22 15:36:31.068373 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:31.068331 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4078b99e_a844_47ea_8e8d_88fefc3efd1b.slice/crio-2fbe2dfe1b519d4ee4104d2f9894dc9e3f3d479a26443787de085f1b6b3c99f9 WatchSource:0}: Error finding container 
2fbe2dfe1b519d4ee4104d2f9894dc9e3f3d479a26443787de085f1b6b3c99f9: Status 404 returned error can't find the container with id 2fbe2dfe1b519d4ee4104d2f9894dc9e3f3d479a26443787de085f1b6b3c99f9 Apr 22 15:36:31.704000 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:31.703952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x6h97" event={"ID":"4078b99e-a844-47ea-8e8d-88fefc3efd1b","Type":"ContainerStarted","Data":"2fbe2dfe1b519d4ee4104d2f9894dc9e3f3d479a26443787de085f1b6b3c99f9"} Apr 22 15:36:31.705193 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:31.705164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" event={"ID":"33db7d39-27dc-47a6-83dc-91f5dff0fb7c","Type":"ContainerStarted","Data":"84c40b661d822ee69be24752abb3a4e90ba6ba01620ac59814a0bdce42dc351c"} Apr 22 15:36:31.706636 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:31.706609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5vh" event={"ID":"2bcb9971-bb8e-460b-b9e5-409f39381abb","Type":"ContainerStarted","Data":"75615079d8ccb78c79670f4517bbe828562ff676080ffddec89e1a29e6ebf724"} Apr 22 15:36:32.165562 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.165530 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fx4lq"] Apr 22 15:36:32.168214 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.168185 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.172498 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.172468 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 15:36:32.172801 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.172760 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6mcjg\"" Apr 22 15:36:32.172935 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.172806 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 15:36:32.172935 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.172825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 15:36:32.173053 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.172974 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 15:36:32.184281 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.184224 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fx4lq"] Apr 22 15:36:32.267473 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.267434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.267672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.267494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jts2p\" (UniqueName: \"kubernetes.io/projected/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-api-access-jts2p\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.267672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.267562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.267672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.267593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-crio-socket\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.267672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.267622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-data-volume\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369244 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.368999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jts2p\" (UniqueName: \"kubernetes.io/projected/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-api-access-jts2p\") pod \"insights-runtime-extractor-fx4lq\" (UID: 
\"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369244 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369244 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-crio-socket\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369244 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-data-volume\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369713 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369390 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-crio-socket\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369713 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.369713 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.369556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-data-volume\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.370048 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.370025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.372321 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.372270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.381821 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.381781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jts2p\" (UniqueName: \"kubernetes.io/projected/b4c0ab11-708d-4eeb-bc03-2f8ab994f98e-kube-api-access-jts2p\") pod \"insights-runtime-extractor-fx4lq\" (UID: \"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e\") " pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.480356 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:36:32.480260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fx4lq" Apr 22 15:36:32.645463 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:32.645429 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fx4lq"] Apr 22 15:36:33.043847 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:33.043794 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c0ab11_708d_4eeb_bc03_2f8ab994f98e.slice/crio-9c7d0c57c416d841118968e6824aa369bd4c566c03ca8956054495c646ff0587 WatchSource:0}: Error finding container 9c7d0c57c416d841118968e6824aa369bd4c566c03ca8956054495c646ff0587: Status 404 returned error can't find the container with id 9c7d0c57c416d841118968e6824aa369bd4c566c03ca8956054495c646ff0587 Apr 22 15:36:33.257130 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.257091 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:36:33.260796 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.260768 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.263492 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.263462 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sfr9v\"" Apr 22 15:36:33.263665 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.263495 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 15:36:33.264514 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.264490 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:36:33.265210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.264927 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:36:33.265210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.264966 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 15:36:33.265210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.264998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 15:36:33.265210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.264928 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 15:36:33.265210 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.265109 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 15:36:33.271964 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.271938 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:36:33.376903 ip-10-0-143-128 kubenswrapper[2573]: I0422 
15:36:33.376809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.377100 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.376958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2t2\" (UniqueName: \"kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.377100 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.377035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.377100 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.377083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.377256 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.377119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config\") pod 
\"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.377256 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.377152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478052 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2t2\" (UniqueName: \"kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478263 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.478935 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.479175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.479175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.478953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.480678 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.480650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.480881 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.480733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.486854 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.486827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2t2\" (UniqueName: \"kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2\") pod \"console-75885c989c-x9pd8\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.573228 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.573185 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:33.713609 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:33.713521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fx4lq" event={"ID":"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e","Type":"ContainerStarted","Data":"9c7d0c57c416d841118968e6824aa369bd4c566c03ca8956054495c646ff0587"} Apr 22 15:36:34.052821 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.052749 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:36:34.056140 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:34.056079 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod876701e9_e1dc_47dc_b581_0b5461b6b59f.slice/crio-64296a91a3c191fc4bb8c14042df635818018a65c348d093f49c2827a8fe9a7a WatchSource:0}: Error finding container 64296a91a3c191fc4bb8c14042df635818018a65c348d093f49c2827a8fe9a7a: Status 404 returned error can't find the container with id 64296a91a3c191fc4bb8c14042df635818018a65c348d093f49c2827a8fe9a7a Apr 22 15:36:34.718356 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.718316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5vh" event={"ID":"2bcb9971-bb8e-460b-b9e5-409f39381abb","Type":"ContainerStarted","Data":"149d2df348de7e5769c2210fabd8dce2d7294fde60f14864ab6146dac9fadc3c"} Apr 22 15:36:34.718356 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.718362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5vh" event={"ID":"2bcb9971-bb8e-460b-b9e5-409f39381abb","Type":"ContainerStarted","Data":"fcfa5846ceb1a18c02b8e44e892e35b2bea0f03cf831da7378e5f802a8fc41de"} Apr 22 15:36:34.718611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.718442 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-6w5vh" Apr 22 15:36:34.720126 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.720092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fx4lq" event={"ID":"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e","Type":"ContainerStarted","Data":"64fa52950a4cd8efa7418fbc7ff4195fa9bb1728798cd5b38320b04fd6b347fa"} Apr 22 15:36:34.720253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.720131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fx4lq" event={"ID":"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e","Type":"ContainerStarted","Data":"e736bb353def304533a18d84055e248f97e80bf0d56bc5bffaa93d404a2cdfd2"} Apr 22 15:36:34.721662 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.721584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x6h97" event={"ID":"4078b99e-a844-47ea-8e8d-88fefc3efd1b","Type":"ContainerStarted","Data":"8520d0443a5bb65d5851cb29d532850584404c88d1dc0be9bc80e9b45c66e339"} Apr 22 15:36:34.723600 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.723572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" event={"ID":"33db7d39-27dc-47a6-83dc-91f5dff0fb7c","Type":"ContainerStarted","Data":"9404559d37ca7f76804302288ed53e12c0c933f66953338634b17b846d97eb7a"} Apr 22 15:36:34.723716 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.723609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" event={"ID":"33db7d39-27dc-47a6-83dc-91f5dff0fb7c","Type":"ContainerStarted","Data":"50b92958924a95004b8013d2e69b6e246694a9a2292d9d5edd171a99c9e79cfb"} Apr 22 15:36:34.724900 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.724876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-75885c989c-x9pd8" event={"ID":"876701e9-e1dc-47dc-b581-0b5461b6b59f","Type":"ContainerStarted","Data":"64296a91a3c191fc4bb8c14042df635818018a65c348d093f49c2827a8fe9a7a"} Apr 22 15:36:34.740798 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.740741 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6w5vh" podStartSLOduration=129.904080041 podStartE2EDuration="2m12.740724807s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:36:31.057475161 +0000 UTC m=+162.383228790" lastFinishedPulling="2026-04-22 15:36:33.894119912 +0000 UTC m=+165.219873556" observedRunningTime="2026-04-22 15:36:34.739278095 +0000 UTC m=+166.065031770" watchObservedRunningTime="2026-04-22 15:36:34.740724807 +0000 UTC m=+166.066478468" Apr 22 15:36:34.756564 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.756500 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-x6h97" podStartSLOduration=129.93490799 podStartE2EDuration="2m12.756482717s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:36:31.070321341 +0000 UTC m=+162.396074973" lastFinishedPulling="2026-04-22 15:36:33.891896057 +0000 UTC m=+165.217649700" observedRunningTime="2026-04-22 15:36:34.756278599 +0000 UTC m=+166.082032278" watchObservedRunningTime="2026-04-22 15:36:34.756482717 +0000 UTC m=+166.082236371" Apr 22 15:36:34.775219 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:34.775156 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qxgng" podStartSLOduration=33.777888902 podStartE2EDuration="36.77513507s" podCreationTimestamp="2026-04-22 15:35:58 +0000 UTC" firstStartedPulling="2026-04-22 15:36:30.894647961 +0000 UTC m=+162.220401593" lastFinishedPulling="2026-04-22 15:36:33.89189413 +0000 UTC 
m=+165.217647761" observedRunningTime="2026-04-22 15:36:34.774461873 +0000 UTC m=+166.100215525" watchObservedRunningTime="2026-04-22 15:36:34.77513507 +0000 UTC m=+166.100888720" Apr 22 15:36:36.129536 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.129496 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g82q2"] Apr 22 15:36:36.132180 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.132153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.134565 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.134532 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 15:36:36.134732 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.134702 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:36:36.136628 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.135732 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:36:36.136628 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.135983 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-njv29\"" Apr 22 15:36:36.136628 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.136300 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:36:36.136628 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.136499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 15:36:36.142823 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.142696 2573 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g82q2"] Apr 22 15:36:36.205209 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.205104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.205209 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.205196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4743dda-39bf-4a2e-b129-091f383cd787-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.205423 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.205230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.205423 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.205281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4djm\" (UniqueName: \"kubernetes.io/projected/a4743dda-39bf-4a2e-b129-091f383cd787-kube-api-access-b4djm\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.305997 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.305957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.306211 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.306017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4743dda-39bf-4a2e-b129-091f383cd787-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.306211 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.306049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.306211 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.306107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4djm\" (UniqueName: \"kubernetes.io/projected/a4743dda-39bf-4a2e-b129-091f383cd787-kube-api-access-b4djm\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.306211 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:36.306150 2573 secret.go:189] 
Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 15:36:36.306398 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:36.306239 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls podName:a4743dda-39bf-4a2e-b129-091f383cd787 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:36.806217323 +0000 UTC m=+168.131970952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-g82q2" (UID: "a4743dda-39bf-4a2e-b129-091f383cd787") : secret "prometheus-operator-tls" not found Apr 22 15:36:36.307253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.307228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4743dda-39bf-4a2e-b129-091f383cd787-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.309320 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.309283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.315919 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.315814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4djm\" (UniqueName: 
\"kubernetes.io/projected/a4743dda-39bf-4a2e-b129-091f383cd787-kube-api-access-b4djm\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.733256 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.733208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fx4lq" event={"ID":"b4c0ab11-708d-4eeb-bc03-2f8ab994f98e","Type":"ContainerStarted","Data":"d245be0562b08087eb5a90267965cfcbd06bb43a7ca339fbf4b8c440997271a8"} Apr 22 15:36:36.751849 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.751797 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fx4lq" podStartSLOduration=2.818592001 podStartE2EDuration="4.751780061s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:33.980020116 +0000 UTC m=+165.305773745" lastFinishedPulling="2026-04-22 15:36:35.913208175 +0000 UTC m=+167.238961805" observedRunningTime="2026-04-22 15:36:36.750516423 +0000 UTC m=+168.076270110" watchObservedRunningTime="2026-04-22 15:36:36.751780061 +0000 UTC m=+168.077533730" Apr 22 15:36:36.811983 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.811935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:36.815057 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:36.815019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4743dda-39bf-4a2e-b129-091f383cd787-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-g82q2\" (UID: \"a4743dda-39bf-4a2e-b129-091f383cd787\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:37.044615 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:37.044568 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" Apr 22 15:36:37.507877 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:37.507843 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g82q2"] Apr 22 15:36:37.511474 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:37.511438 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4743dda_39bf_4a2e_b129_091f383cd787.slice/crio-1cc730f303662eae795868d7baa67b1929f77f6a160c277e65bed19e86e53ecb WatchSource:0}: Error finding container 1cc730f303662eae795868d7baa67b1929f77f6a160c277e65bed19e86e53ecb: Status 404 returned error can't find the container with id 1cc730f303662eae795868d7baa67b1929f77f6a160c277e65bed19e86e53ecb Apr 22 15:36:37.737700 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:37.737669 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75885c989c-x9pd8" event={"ID":"876701e9-e1dc-47dc-b581-0b5461b6b59f","Type":"ContainerStarted","Data":"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3"} Apr 22 15:36:37.738851 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:37.738824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" event={"ID":"a4743dda-39bf-4a2e-b129-091f383cd787","Type":"ContainerStarted","Data":"1cc730f303662eae795868d7baa67b1929f77f6a160c277e65bed19e86e53ecb"} Apr 22 15:36:37.757861 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:37.757813 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-75885c989c-x9pd8" podStartSLOduration=1.413914658 podStartE2EDuration="4.75779723s" podCreationTimestamp="2026-04-22 15:36:33 +0000 UTC" firstStartedPulling="2026-04-22 15:36:34.058511727 +0000 UTC m=+165.384265357" lastFinishedPulling="2026-04-22 15:36:37.402394293 +0000 UTC m=+168.728147929" observedRunningTime="2026-04-22 15:36:37.756977793 +0000 UTC m=+169.082731468" watchObservedRunningTime="2026-04-22 15:36:37.75779723 +0000 UTC m=+169.083550880" Apr 22 15:36:39.745849 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:39.745809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" event={"ID":"a4743dda-39bf-4a2e-b129-091f383cd787","Type":"ContainerStarted","Data":"a0138d975f471e2fbbbb02f2638760d07ef746932f2ec51f1feda6b4f3cf6d46"} Apr 22 15:36:39.745849 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:39.745846 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" event={"ID":"a4743dda-39bf-4a2e-b129-091f383cd787","Type":"ContainerStarted","Data":"0fc82a98219f0cf351e2afc2f7ccd83c0b0f6ad9cf03bd1a6bdec98162a82738"} Apr 22 15:36:39.762821 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:39.762758 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-g82q2" podStartSLOduration=2.356063503 podStartE2EDuration="3.762739484s" podCreationTimestamp="2026-04-22 15:36:36 +0000 UTC" firstStartedPulling="2026-04-22 15:36:37.514442074 +0000 UTC m=+168.840195703" lastFinishedPulling="2026-04-22 15:36:38.921118051 +0000 UTC m=+170.246871684" observedRunningTime="2026-04-22 15:36:39.762019088 +0000 UTC m=+171.087772751" watchObservedRunningTime="2026-04-22 15:36:39.762739484 +0000 UTC m=+171.088493134" Apr 22 15:36:41.250045 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.250003 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:36:41.508700 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.508601 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tskkt"] Apr 22 15:36:41.511360 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.511341 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.515892 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.515857 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 15:36:41.516942 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.516904 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-gn49b\"" Apr 22 15:36:41.517111 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.516965 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 15:36:41.518411 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.518386 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k4jmm"] Apr 22 15:36:41.519356 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.519329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 15:36:41.521148 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.521122 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.523731 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.523704 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 15:36:41.523844 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.523709 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 15:36:41.523965 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.523950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pqtf6\"" Apr 22 15:36:41.524137 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.524121 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 15:36:41.533512 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.533480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tskkt"] Apr 22 15:36:41.654645 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/500e5ead-fa2e-40ff-8137-d9dbe8098414-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.654645 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls\") pod \"node-exporter-k4jmm\" (UID: 
\"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-root\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-wtmp\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-textfile\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-metrics-client-ca\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.654916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.655303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsh5t\" (UniqueName: \"kubernetes.io/projected/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-kube-api-access-qsh5t\") pod \"node-exporter-k4jmm\" (UID: 
\"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.655303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.654983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.655303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.655003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmlb\" (UniqueName: \"kubernetes.io/projected/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-api-access-dsmlb\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.655303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.655035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.655303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.655057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-sys\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 
15:36:41.756212 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756212 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756216 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756483 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756483 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsh5t\" (UniqueName: \"kubernetes.io/projected/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-kube-api-access-qsh5t\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756483 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756483 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmlb\" (UniqueName: \"kubernetes.io/projected/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-api-access-dsmlb\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-sys\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/500e5ead-fa2e-40ff-8137-d9dbe8098414-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-root\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.756676 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-wtmp\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-sys\") pod \"node-exporter-k4jmm\" (UID: 
\"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-textfile\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-metrics-client-ca\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:41.756839 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:36:41.756903 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls podName:a9bbe7fd-fc20-4d50-828c-7ba2ab200da2 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:42.256881682 +0000 UTC m=+173.582635319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls") pod "node-exporter-k4jmm" (UID: "a9bbe7fd-fc20-4d50-828c-7ba2ab200da2") : secret "node-exporter-tls" not found Apr 22 15:36:41.757001 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.756996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-textfile\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757312 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-root\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757312 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.757312 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-wtmp\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757439 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757425 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/500e5ead-fa2e-40ff-8137-d9dbe8098414-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.757591 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-metrics-client-ca\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.757943 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.758052 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.757995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.759429 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.759370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.759611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.759587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.759761 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.759743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.766131 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.766104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsh5t\" (UniqueName: \"kubernetes.io/projected/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-kube-api-access-qsh5t\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:41.771824 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.771796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmlb\" (UniqueName: \"kubernetes.io/projected/500e5ead-fa2e-40ff-8137-d9dbe8098414-kube-api-access-dsmlb\") pod \"kube-state-metrics-69db897b98-tskkt\" (UID: \"500e5ead-fa2e-40ff-8137-d9dbe8098414\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.823597 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.823555 2573 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" Apr 22 15:36:41.960022 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:41.959987 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tskkt"] Apr 22 15:36:41.963388 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:41.963349 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500e5ead_fa2e_40ff_8137_d9dbe8098414.slice/crio-0951fdaa2cdbb2361a05c121c1f87061fc04b47c09160137d6bdab7530c4ec22 WatchSource:0}: Error finding container 0951fdaa2cdbb2361a05c121c1f87061fc04b47c09160137d6bdab7530c4ec22: Status 404 returned error can't find the container with id 0951fdaa2cdbb2361a05c121c1f87061fc04b47c09160137d6bdab7530c4ec22 Apr 22 15:36:42.262466 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:42.262421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:42.264959 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:42.264936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9bbe7fd-fc20-4d50-828c-7ba2ab200da2-node-exporter-tls\") pod \"node-exporter-k4jmm\" (UID: \"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2\") " pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:42.432770 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:42.432633 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k4jmm" Apr 22 15:36:42.443459 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:42.443422 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9bbe7fd_fc20_4d50_828c_7ba2ab200da2.slice/crio-2e40403202122ff467f31188f4c4b9569395a3dfa77c0e551e232df4d5449e91 WatchSource:0}: Error finding container 2e40403202122ff467f31188f4c4b9569395a3dfa77c0e551e232df4d5449e91: Status 404 returned error can't find the container with id 2e40403202122ff467f31188f4c4b9569395a3dfa77c0e551e232df4d5449e91 Apr 22 15:36:42.756527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:42.756489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jmm" event={"ID":"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2","Type":"ContainerStarted","Data":"2e40403202122ff467f31188f4c4b9569395a3dfa77c0e551e232df4d5449e91"} Apr 22 15:36:42.757752 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:42.757716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" event={"ID":"500e5ead-fa2e-40ff-8137-d9dbe8098414","Type":"ContainerStarted","Data":"0951fdaa2cdbb2361a05c121c1f87061fc04b47c09160137d6bdab7530c4ec22"} Apr 22 15:36:43.575362 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.574476 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:43.575362 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.575320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:36:43.576865 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.576832 2573 patch_prober.go:28] interesting pod/console-75885c989c-x9pd8 container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" start-of-body= Apr 22 15:36:43.576979 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.576901 2573 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-75885c989c-x9pd8" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerName="console" probeResult="failure" output="Get \"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" Apr 22 15:36:43.763919 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.763708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" event={"ID":"500e5ead-fa2e-40ff-8137-d9dbe8098414","Type":"ContainerStarted","Data":"d7c5fab4f8d00be110344aef30a3023c3f1753af1269d4c4ef60726112600cd3"} Apr 22 15:36:43.763919 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.763752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" event={"ID":"500e5ead-fa2e-40ff-8137-d9dbe8098414","Type":"ContainerStarted","Data":"06203a363327f2a43b16ecf78c617382c640b924a52d829a651f5b755f1a25ea"} Apr 22 15:36:43.763919 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.763762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" event={"ID":"500e5ead-fa2e-40ff-8137-d9dbe8098414","Type":"ContainerStarted","Data":"25c7e28a6502d6eaf2dc2b5c2dc51e58eda72cb130f34b169fb1f5fccbfacb11"} Apr 22 15:36:43.765396 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.765370 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9bbe7fd-fc20-4d50-828c-7ba2ab200da2" containerID="947aadb56740bcbf52a5fc7b054e3f8a8e3dc730f08fc0b1ee0066ed65b7c4ed" exitCode=0 Apr 22 15:36:43.765513 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.765454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-k4jmm" event={"ID":"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2","Type":"ContainerDied","Data":"947aadb56740bcbf52a5fc7b054e3f8a8e3dc730f08fc0b1ee0066ed65b7c4ed"} Apr 22 15:36:43.785651 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:43.785603 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tskkt" podStartSLOduration=1.376114256 podStartE2EDuration="2.785585614s" podCreationTimestamp="2026-04-22 15:36:41 +0000 UTC" firstStartedPulling="2026-04-22 15:36:41.965278665 +0000 UTC m=+173.291032295" lastFinishedPulling="2026-04-22 15:36:43.37475002 +0000 UTC m=+174.700503653" observedRunningTime="2026-04-22 15:36:43.783880127 +0000 UTC m=+175.109633779" watchObservedRunningTime="2026-04-22 15:36:43.785585614 +0000 UTC m=+175.111339265" Apr 22 15:36:44.730475 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:44.730444 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6w5vh" Apr 22 15:36:44.771681 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:44.771638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jmm" event={"ID":"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2","Type":"ContainerStarted","Data":"dd20340a4434c0130581e069e9c16b730cd1983fff55df3e4fd1653e67dcdf07"} Apr 22 15:36:44.771901 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:44.771864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jmm" event={"ID":"a9bbe7fd-fc20-4d50-828c-7ba2ab200da2","Type":"ContainerStarted","Data":"fa725acb96dfb7b1288d65ef376fec37db04c82f6ee24aa6cc8c925d7b3829aa"} Apr 22 15:36:44.796303 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:44.796245 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k4jmm" podStartSLOduration=2.866307054 podStartE2EDuration="3.796226178s" 
podCreationTimestamp="2026-04-22 15:36:41 +0000 UTC" firstStartedPulling="2026-04-22 15:36:42.445440074 +0000 UTC m=+173.771193704" lastFinishedPulling="2026-04-22 15:36:43.375359192 +0000 UTC m=+174.701112828" observedRunningTime="2026-04-22 15:36:44.795000776 +0000 UTC m=+176.120754428" watchObservedRunningTime="2026-04-22 15:36:44.796226178 +0000 UTC m=+176.121979826" Apr 22 15:36:45.925616 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.925577 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56b9bf79c7-j65r7"] Apr 22 15:36:45.928046 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.928014 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:45.930726 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.930700 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-rlv9s\"" Apr 22 15:36:45.930913 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.930892 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 15:36:45.932670 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.932648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 15:36:45.932670 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.932662 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:36:45.933397 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.933376 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 15:36:45.933487 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.933420 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bm3d792gd8p43\"" Apr 22 15:36:45.950677 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:45.950636 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56b9bf79c7-j65r7"] Apr 22 15:36:46.096767 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-tls\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.096913 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-client-certs\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.096913 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-metrics-server-audit-profiles\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.096913 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mkvm\" (UniqueName: 
\"kubernetes.io/projected/3e3e029f-5d45-4031-9e8d-502a525ad806-kube-api-access-6mkvm\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.096913 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.097082 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-client-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.097082 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.096984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3e3e029f-5d45-4031-9e8d-502a525ad806-audit-log\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198443 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-client-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " 
pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198443 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3e3e029f-5d45-4031-9e8d-502a525ad806-audit-log\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198443 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-tls\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198717 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-client-certs\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198717 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-metrics-server-audit-profiles\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198717 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6mkvm\" (UniqueName: \"kubernetes.io/projected/3e3e029f-5d45-4031-9e8d-502a525ad806-kube-api-access-6mkvm\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198870 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.198916 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.198881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3e3e029f-5d45-4031-9e8d-502a525ad806-audit-log\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.199556 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.199532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-metrics-server-audit-profiles\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.199708 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.199686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3e029f-5d45-4031-9e8d-502a525ad806-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " 
pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.201253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.201226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-tls\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.201334 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.201301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-secret-metrics-server-client-certs\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.201334 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.201304 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e029f-5d45-4031-9e8d-502a525ad806-client-ca-bundle\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.210003 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.209968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mkvm\" (UniqueName: \"kubernetes.io/projected/3e3e029f-5d45-4031-9e8d-502a525ad806-kube-api-access-6mkvm\") pod \"metrics-server-56b9bf79c7-j65r7\" (UID: \"3e3e029f-5d45-4031-9e8d-502a525ad806\") " pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.239034 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.238990 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:36:46.384171 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.384134 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56b9bf79c7-j65r7"] Apr 22 15:36:46.389164 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:46.389122 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e3e029f_5d45_4031_9e8d_502a525ad806.slice/crio-ca8187542cfaaeff00a2db7f9e8f773b10b48676c4fe8d384e00610c35fdb3a4 WatchSource:0}: Error finding container ca8187542cfaaeff00a2db7f9e8f773b10b48676c4fe8d384e00610c35fdb3a4: Status 404 returned error can't find the container with id ca8187542cfaaeff00a2db7f9e8f773b10b48676c4fe8d384e00610c35fdb3a4 Apr 22 15:36:46.779325 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:46.779282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" event={"ID":"3e3e029f-5d45-4031-9e8d-502a525ad806","Type":"ContainerStarted","Data":"ca8187542cfaaeff00a2db7f9e8f773b10b48676c4fe8d384e00610c35fdb3a4"} Apr 22 15:36:47.918033 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:47.917994 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:36:48.786642 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:48.786594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" event={"ID":"3e3e029f-5d45-4031-9e8d-502a525ad806","Type":"ContainerStarted","Data":"9c119cff059a1c7e9294b9fb67e7f064a08a5c13a13ad1fb08e939532b019f0f"} Apr 22 15:36:48.806998 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:48.806939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" podStartSLOduration=2.186005477 
podStartE2EDuration="3.80692309s" podCreationTimestamp="2026-04-22 15:36:45 +0000 UTC" firstStartedPulling="2026-04-22 15:36:46.391191301 +0000 UTC m=+177.716944933" lastFinishedPulling="2026-04-22 15:36:48.012108907 +0000 UTC m=+179.337862546" observedRunningTime="2026-04-22 15:36:48.805392303 +0000 UTC m=+180.131145955" watchObservedRunningTime="2026-04-22 15:36:48.80692309 +0000 UTC m=+180.132676798" Apr 22 15:36:49.197561 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.197472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:36:49.200358 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.200330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.212717 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.212650 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 15:36:49.213104 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.212746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:36:49.325748 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.325700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.325748 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.325749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") 
" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.325981 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.325841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.325981 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.325911 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.325981 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.325962 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.326191 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.326042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.326191 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.326110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk78m\" (UniqueName: 
\"kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427317 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427317 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427490 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hk78m\" (UniqueName: \"kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.427593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.427558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.428209 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.428181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.428331 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.428197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.428331 ip-10-0-143-128 kubenswrapper[2573]: I0422 
15:36:49.428225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.428586 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.428565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.429945 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.429927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.430193 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.430177 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.437418 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.437386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk78m\" (UniqueName: \"kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m\") pod \"console-746876875-b8fxh\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.515923 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.515862 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:49.651514 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.651474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:36:49.654975 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:36:49.654941 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf15a5f79_c5c1_4e66_a32f_d0910f7decab.slice/crio-ef061b6d7e120bc743e72fc2734bbb400610574cecd1d02141f9a64cc02d5896 WatchSource:0}: Error finding container ef061b6d7e120bc743e72fc2734bbb400610574cecd1d02141f9a64cc02d5896: Status 404 returned error can't find the container with id ef061b6d7e120bc743e72fc2734bbb400610574cecd1d02141f9a64cc02d5896 Apr 22 15:36:49.791467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.791365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-746876875-b8fxh" event={"ID":"f15a5f79-c5c1-4e66-a32f-d0910f7decab","Type":"ContainerStarted","Data":"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e"} Apr 22 15:36:49.791467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.791407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-746876875-b8fxh" event={"ID":"f15a5f79-c5c1-4e66-a32f-d0910f7decab","Type":"ContainerStarted","Data":"ef061b6d7e120bc743e72fc2734bbb400610574cecd1d02141f9a64cc02d5896"} Apr 22 15:36:49.809343 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:49.809283 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-746876875-b8fxh" podStartSLOduration=0.809265975 podStartE2EDuration="809.265975ms" podCreationTimestamp="2026-04-22 15:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:49.80773209 +0000 UTC m=+181.133485742" watchObservedRunningTime="2026-04-22 15:36:49.809265975 +0000 UTC m=+181.135019626" Apr 22 15:36:50.499495 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:50.499459 2573 patch_prober.go:28] interesting pod/image-registry-57f57db9dc-2l5nd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:36:50.499892 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:50.499513 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:36:51.711422 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:51.711395 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:36:54.585960 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:54.585913 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"] Apr 22 15:36:59.516866 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:59.516817 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:59.516866 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:59.516873 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:59.522039 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:59.522005 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-746876875-b8fxh" Apr 22 15:36:59.825020 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:36:59.824931 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:37:06.240145 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:06.240083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:37:06.240145 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:06.240152 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:37:12.942337 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:12.942280 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-75885c989c-x9pd8" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerName="console" containerID="cri-o://879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3" gracePeriod=15 Apr 22 15:37:13.186822 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.186797 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75885c989c-x9pd8_876701e9-e1dc-47dc-b581-0b5461b6b59f/console/0.log" Apr 22 15:37:13.186963 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.186873 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:37:13.246593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246557 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.246785 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2t2\" (UniqueName: \"kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.246785 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246639 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.246785 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246657 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.246785 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.246785 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.246728 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config\") pod \"876701e9-e1dc-47dc-b581-0b5461b6b59f\" (UID: \"876701e9-e1dc-47dc-b581-0b5461b6b59f\") " Apr 22 15:37:13.247142 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.247111 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config" (OuterVolumeSpecName: "console-config") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:13.247249 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.247146 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca" (OuterVolumeSpecName: "service-ca") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:13.247249 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.247119 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:13.249176 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.249147 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2" (OuterVolumeSpecName: "kube-api-access-bm2t2") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "kube-api-access-bm2t2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:37:13.249266 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.249222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:13.249344 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.249324 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "876701e9-e1dc-47dc-b581-0b5461b6b59f" (UID: "876701e9-e1dc-47dc-b581-0b5461b6b59f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:13.347611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347566 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-service-ca\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.347611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347605 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-oauth-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.347611 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347620 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-oauth-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.347852 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347632 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bm2t2\" (UniqueName: \"kubernetes.io/projected/876701e9-e1dc-47dc-b581-0b5461b6b59f-kube-api-access-bm2t2\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.347852 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347645 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.347852 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.347659 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876701e9-e1dc-47dc-b581-0b5461b6b59f-console-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:13.862495 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:37:13.862466 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75885c989c-x9pd8_876701e9-e1dc-47dc-b581-0b5461b6b59f/console/0.log" Apr 22 15:37:13.862664 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.862505 2573 generic.go:358] "Generic (PLEG): container finished" podID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerID="879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3" exitCode=2 Apr 22 15:37:13.862664 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.862539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75885c989c-x9pd8" event={"ID":"876701e9-e1dc-47dc-b581-0b5461b6b59f","Type":"ContainerDied","Data":"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3"} Apr 22 15:37:13.862664 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.862572 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75885c989c-x9pd8" Apr 22 15:37:13.862664 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.862581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75885c989c-x9pd8" event={"ID":"876701e9-e1dc-47dc-b581-0b5461b6b59f","Type":"ContainerDied","Data":"64296a91a3c191fc4bb8c14042df635818018a65c348d093f49c2827a8fe9a7a"} Apr 22 15:37:13.862664 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.862600 2573 scope.go:117] "RemoveContainer" containerID="879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3" Apr 22 15:37:13.870633 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.870615 2573 scope.go:117] "RemoveContainer" containerID="879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3" Apr 22 15:37:13.870946 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:37:13.870925 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3\": container with ID starting with 879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3 not found: ID does not exist" containerID="879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3" Apr 22 15:37:13.871017 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.870952 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3"} err="failed to get container status \"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3\": rpc error: code = NotFound desc = could not find container \"879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3\": container with ID starting with 879e7088161367942791fa7598c133bc85258793db8d48fd845ef8e5594288f3 not found: ID does not exist" Apr 22 15:37:13.879756 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.879725 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:37:13.883960 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:13.883931 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75885c989c-x9pd8"] Apr 22 15:37:15.258498 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:15.258461 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" path="/var/lib/kubelet/pods/876701e9-e1dc-47dc-b581-0b5461b6b59f/volumes" Apr 22 15:37:19.605756 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.605690 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerName="registry" containerID="cri-o://056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb" gracePeriod=30 Apr 22 15:37:19.848287 ip-10-0-143-128 kubenswrapper[2573]: I0422 
15:37:19.848262 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:37:19.883226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.883130 2573 generic.go:358] "Generic (PLEG): container finished" podID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerID="056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb" exitCode=0 Apr 22 15:37:19.883226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.883192 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" Apr 22 15:37:19.883226 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.883208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" event={"ID":"1aa0caed-5656-4ed9-8cc0-35f66d7d8123","Type":"ContainerDied","Data":"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb"} Apr 22 15:37:19.883467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.883241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57f57db9dc-2l5nd" event={"ID":"1aa0caed-5656-4ed9-8cc0-35f66d7d8123","Type":"ContainerDied","Data":"74250c3747988c6d1e1abda6658f12c76da0a9102cbd166bc71ca0f9bd27ae1f"} Apr 22 15:37:19.883467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.883261 2573 scope.go:117] "RemoveContainer" containerID="056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb" Apr 22 15:37:19.892927 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.892892 2573 scope.go:117] "RemoveContainer" containerID="056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb" Apr 22 15:37:19.893303 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:37:19.893276 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb\": container with ID starting with 056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb not found: ID does not exist" containerID="056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb" Apr 22 15:37:19.893402 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:19.893312 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb"} err="failed to get container status \"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb\": rpc error: code = NotFound desc = could not find container \"056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb\": container with ID starting with 056a9d63ff8c43589a80afba47d9c867da3225dd035c23579670a3114225b7bb not found: ID does not exist" Apr 22 15:37:20.002704 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002669 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6njk\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.002878 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.002878 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002823 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: 
\"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.002878 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002858 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.003045 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002885 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.003045 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002908 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.003045 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.002933 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.003045 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.003007 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") pod \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\" (UID: \"1aa0caed-5656-4ed9-8cc0-35f66d7d8123\") " Apr 22 15:37:20.003893 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:37:20.003331 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:20.003893 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.003375 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:20.005858 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.005827 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:37:20.006188 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.006163 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:20.006308 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.006164 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk" (OuterVolumeSpecName: "kube-api-access-d6njk") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "kube-api-access-d6njk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:37:20.006308 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.006256 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:20.006397 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.006335 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:37:20.011704 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.011582 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1aa0caed-5656-4ed9-8cc0-35f66d7d8123" (UID: "1aa0caed-5656-4ed9-8cc0-35f66d7d8123"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:37:20.104352 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104298 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-trusted-ca\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104352 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104343 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-bound-sa-token\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104352 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104353 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-installation-pull-secrets\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104352 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104365 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-tls\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104352 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104374 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6njk\" (UniqueName: \"kubernetes.io/projected/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-kube-api-access-d6njk\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104659 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104386 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-ca-trust-extracted\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104659 ip-10-0-143-128 
kubenswrapper[2573]: I0422 15:37:20.104395 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-registry-certificates\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.104659 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.104404 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1aa0caed-5656-4ed9-8cc0-35f66d7d8123-image-registry-private-configuration\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:37:20.205454 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.205414 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"] Apr 22 15:37:20.212130 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:20.210493 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57f57db9dc-2l5nd"] Apr 22 15:37:21.253672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:21.253631 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" path="/var/lib/kubelet/pods/1aa0caed-5656-4ed9-8cc0-35f66d7d8123/volumes" Apr 22 15:37:26.244986 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:26.244955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:37:26.249047 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:37:26.249025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56b9bf79c7-j65r7" Apr 22 15:38:01.028464 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:01.028370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:38:01.031123 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:01.031097 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0708298d-9f47-4968-9489-c7cb22cb282c-metrics-certs\") pod \"network-metrics-daemon-vk5nl\" (UID: \"0708298d-9f47-4968-9489-c7cb22cb282c\") " pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:38:01.056663 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:01.056620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\"" Apr 22 15:38:01.061241 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:01.061219 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vk5nl" Apr 22 15:38:01.203192 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:01.203159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vk5nl"] Apr 22 15:38:01.206566 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:38:01.206523 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0708298d_9f47_4968_9489_c7cb22cb282c.slice/crio-e8cf4e878fc6edf74ca49012b4ef6e29fe68d671d26167f280908d2c1a6b1942 WatchSource:0}: Error finding container e8cf4e878fc6edf74ca49012b4ef6e29fe68d671d26167f280908d2c1a6b1942: Status 404 returned error can't find the container with id e8cf4e878fc6edf74ca49012b4ef6e29fe68d671d26167f280908d2c1a6b1942 Apr 22 15:38:02.025375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:02.025335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vk5nl" 
event={"ID":"0708298d-9f47-4968-9489-c7cb22cb282c","Type":"ContainerStarted","Data":"e8cf4e878fc6edf74ca49012b4ef6e29fe68d671d26167f280908d2c1a6b1942"} Apr 22 15:38:03.029313 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:03.029270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vk5nl" event={"ID":"0708298d-9f47-4968-9489-c7cb22cb282c","Type":"ContainerStarted","Data":"54dc479184e3f144441da994f33831ac8c8724125d5a5fc34914d057f13ade3e"} Apr 22 15:38:03.029313 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:03.029312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vk5nl" event={"ID":"0708298d-9f47-4968-9489-c7cb22cb282c","Type":"ContainerStarted","Data":"4c72f49b04a2e0397dff0efbad4da1659a03b32a4ad7d858e9a093707f04bc64"} Apr 22 15:38:03.048354 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:03.048299 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vk5nl" podStartSLOduration=253.061841195 podStartE2EDuration="4m14.048284327s" podCreationTimestamp="2026-04-22 15:33:49 +0000 UTC" firstStartedPulling="2026-04-22 15:38:01.20849965 +0000 UTC m=+252.534253286" lastFinishedPulling="2026-04-22 15:38:02.194942788 +0000 UTC m=+253.520696418" observedRunningTime="2026-04-22 15:38:03.046253012 +0000 UTC m=+254.372006664" watchObservedRunningTime="2026-04-22 15:38:03.048284327 +0000 UTC m=+254.374037986" Apr 22 15:38:09.573517 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.573478 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:38:09.574605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574578 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerName="console" Apr 22 15:38:09.574712 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574610 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerName="console" Apr 22 15:38:09.574712 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574641 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerName="registry" Apr 22 15:38:09.574712 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574650 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerName="registry" Apr 22 15:38:09.574712 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574708 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="876701e9-e1dc-47dc-b581-0b5461b6b59f" containerName="console" Apr 22 15:38:09.574879 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.574723 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aa0caed-5656-4ed9-8cc0-35f66d7d8123" containerName="registry" Apr 22 15:38:09.577720 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.577693 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.588817 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.588785 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:38:09.600555 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6dg\" (UniqueName: \"kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.600555 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.600815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.600815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 
15:38:09.600815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.600815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.600815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.600784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701359 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701359 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca\") pod 
\"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.701605 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.701496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6dg\" (UniqueName: 
\"kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.702222 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.702146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.702222 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.702250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.702535 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.702336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.702596 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.702566 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.704175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.704153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.704279 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.704229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.711212 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.711186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6dg\" (UniqueName: \"kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg\") pod \"console-85b6d7fcc4-vw8q7\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:09.889928 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:09.889823 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:10.043101 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:10.042023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:38:10.045523 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:38:10.045490 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e929d7_d1b5_4082_a78e_d3d434df3337.slice/crio-692102aa9a0376cd5af387819bd439178cb81e8b7d5724488c02bf41d596fa91 WatchSource:0}: Error finding container 692102aa9a0376cd5af387819bd439178cb81e8b7d5724488c02bf41d596fa91: Status 404 returned error can't find the container with id 692102aa9a0376cd5af387819bd439178cb81e8b7d5724488c02bf41d596fa91 Apr 22 15:38:10.052273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:10.052240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b6d7fcc4-vw8q7" event={"ID":"70e929d7-d1b5-4082-a78e-d3d434df3337","Type":"ContainerStarted","Data":"692102aa9a0376cd5af387819bd439178cb81e8b7d5724488c02bf41d596fa91"} Apr 22 15:38:11.056671 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:11.056626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b6d7fcc4-vw8q7" event={"ID":"70e929d7-d1b5-4082-a78e-d3d434df3337","Type":"ContainerStarted","Data":"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25"} Apr 22 15:38:11.084259 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:11.084199 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85b6d7fcc4-vw8q7" podStartSLOduration=2.084178991 podStartE2EDuration="2.084178991s" podCreationTimestamp="2026-04-22 15:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:38:11.082233486 +0000 UTC 
m=+262.407987149" watchObservedRunningTime="2026-04-22 15:38:11.084178991 +0000 UTC m=+262.409932646" Apr 22 15:38:19.890439 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:19.890388 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:19.890439 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:19.890441 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:19.895128 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:19.895100 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:20.085512 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:20.085483 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:38:20.170392 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:20.170276 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:38:45.190583 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.190522 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-746876875-b8fxh" podUID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" containerName="console" containerID="cri-o://f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e" gracePeriod=15 Apr 22 15:38:45.431591 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.431568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-746876875-b8fxh_f15a5f79-c5c1-4e66-a32f-d0910f7decab/console/0.log" Apr 22 15:38:45.431729 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.431630 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:38:45.582289 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582248 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk78m\" (UniqueName: \"kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582320 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582359 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582387 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582425 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582453 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582505 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582476 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert\") pod \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\" (UID: \"f15a5f79-c5c1-4e66-a32f-d0910f7decab\") " Apr 22 15:38:45.582830 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582783 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:45.582906 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582827 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca" (OuterVolumeSpecName: "service-ca") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:45.582906 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582839 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config" (OuterVolumeSpecName: "console-config") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:45.582906 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.582854 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:45.584685 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.584652 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:45.584797 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.584703 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m" (OuterVolumeSpecName: "kube-api-access-hk78m") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "kube-api-access-hk78m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:38:45.584797 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.584730 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f15a5f79-c5c1-4e66-a32f-d0910f7decab" (UID: "f15a5f79-c5c1-4e66-a32f-d0910f7decab"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:45.683329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683277 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hk78m\" (UniqueName: \"kubernetes.io/projected/f15a5f79-c5c1-4e66-a32f-d0910f7decab-kube-api-access-hk78m\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683320 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-service-ca\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683334 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-oauth-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683344 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-oauth-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683353 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-trusted-ca-bundle\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683363 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:45.683610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:45.683372 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f15a5f79-c5c1-4e66-a32f-d0910f7decab-console-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:38:46.154394 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154360 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-746876875-b8fxh_f15a5f79-c5c1-4e66-a32f-d0910f7decab/console/0.log" Apr 22 15:38:46.154576 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154408 2573 generic.go:358] "Generic (PLEG): container finished" podID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" containerID="f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e" exitCode=2 Apr 22 15:38:46.154576 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154480 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-746876875-b8fxh" Apr 22 15:38:46.154576 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-746876875-b8fxh" event={"ID":"f15a5f79-c5c1-4e66-a32f-d0910f7decab","Type":"ContainerDied","Data":"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e"} Apr 22 15:38:46.154701 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-746876875-b8fxh" event={"ID":"f15a5f79-c5c1-4e66-a32f-d0910f7decab","Type":"ContainerDied","Data":"ef061b6d7e120bc743e72fc2734bbb400610574cecd1d02141f9a64cc02d5896"} Apr 22 15:38:46.154701 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.154623 2573 scope.go:117] "RemoveContainer" containerID="f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e" Apr 22 15:38:46.168490 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.168464 2573 scope.go:117] "RemoveContainer" containerID="f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e" Apr 22 15:38:46.168840 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:38:46.168813 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e\": container with ID starting with f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e not found: ID does not exist" containerID="f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e" Apr 22 15:38:46.168903 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.168854 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e"} err="failed to get container status \"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e\": rpc error: code = 
NotFound desc = could not find container \"f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e\": container with ID starting with f0664fd387b6f6ab532bc48fa13b389a5c89ea8f8c2c1cde1bc9eb00f93b342e not found: ID does not exist" Apr 22 15:38:46.180965 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.180933 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:38:46.183688 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:46.183649 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-746876875-b8fxh"] Apr 22 15:38:47.253723 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:47.253683 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" path="/var/lib/kubelet/pods/f15a5f79-c5c1-4e66-a32f-d0910f7decab/volumes" Apr 22 15:38:49.130936 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:49.130911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:38:49.131374 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:49.130911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:38:49.134098 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:49.134053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:38:49.134249 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:49.134138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:38:49.140402 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:38:49.140378 2573 
kubelet.go:1628] "Image garbage collection succeeded" Apr 22 15:39:16.988593 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:16.988553 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:39:16.994124 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:16.988991 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" containerName="console" Apr 22 15:39:16.994124 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:16.989011 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" containerName="console" Apr 22 15:39:16.994124 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:16.989118 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f15a5f79-c5c1-4e66-a32f-d0910f7decab" containerName="console" Apr 22 15:39:16.994695 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:16.994658 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.003119 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.002876 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:39:17.035385 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.035582 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.035582 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.035582 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7whmj\" (UniqueName: \"kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" 
Apr 22 15:39:17.035582 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.035582 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.035772 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.035598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136674 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136674 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle\") pod 
\"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136954 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136954 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136954 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7whmj\" (UniqueName: \"kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.136954 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.137187 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.136977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.137573 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.137535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.137681 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.137599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.137681 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.137622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.137959 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.137851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.139749 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.139727 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.139894 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.139876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.147511 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.147477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7whmj\" (UniqueName: \"kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj\") pod \"console-6c48f4fc67-p5qwx\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.306804 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.306683 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:17.444637 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.444421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:39:17.447407 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:39:17.447373 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fcd2ee2_b3e3_496d_873f_509fdf93ed27.slice/crio-867ea4067c641441771fb5ea775da9e8ed73a20bc084e6db3a11a7ef376e5386 WatchSource:0}: Error finding container 867ea4067c641441771fb5ea775da9e8ed73a20bc084e6db3a11a7ef376e5386: Status 404 returned error can't find the container with id 867ea4067c641441771fb5ea775da9e8ed73a20bc084e6db3a11a7ef376e5386 Apr 22 15:39:17.449294 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:17.449273 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:39:18.245884 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:18.245850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c48f4fc67-p5qwx" event={"ID":"8fcd2ee2-b3e3-496d-873f-509fdf93ed27","Type":"ContainerStarted","Data":"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5"} Apr 22 15:39:18.245884 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:18.245887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c48f4fc67-p5qwx" event={"ID":"8fcd2ee2-b3e3-496d-873f-509fdf93ed27","Type":"ContainerStarted","Data":"867ea4067c641441771fb5ea775da9e8ed73a20bc084e6db3a11a7ef376e5386"} Apr 22 15:39:18.263479 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:18.263427 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c48f4fc67-p5qwx" podStartSLOduration=2.263412538 podStartE2EDuration="2.263412538s" podCreationTimestamp="2026-04-22 15:39:16 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:39:18.261785967 +0000 UTC m=+329.587539621" watchObservedRunningTime="2026-04-22 15:39:18.263412538 +0000 UTC m=+329.589166188" Apr 22 15:39:27.307360 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:27.307299 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:27.307823 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:27.307393 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:27.312173 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:27.312151 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:28.277683 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:28.277658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:39:28.354953 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:28.354916 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:39:53.375320 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.375258 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85b6d7fcc4-vw8q7" podUID="70e929d7-d1b5-4082-a78e-d3d434df3337" containerName="console" containerID="cri-o://e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25" gracePeriod=15 Apr 22 15:39:53.618455 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.618430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85b6d7fcc4-vw8q7_70e929d7-d1b5-4082-a78e-d3d434df3337/console/0.log" Apr 22 15:39:53.618624 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.618497 2573 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:39:53.715467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715372 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715429 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715453 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715467 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715471 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6dg\" (UniqueName: \"kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715500 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: 
\"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715516 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.715862 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715564 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle\") pod \"70e929d7-d1b5-4082-a78e-d3d434df3337\" (UID: \"70e929d7-d1b5-4082-a78e-d3d434df3337\") " Apr 22 15:39:53.716027 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.715908 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca" (OuterVolumeSpecName: "service-ca") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:39:53.716027 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.716009 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config" (OuterVolumeSpecName: "console-config") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:39:53.716182 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.716014 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:39:53.716182 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.716104 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:39:53.717855 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.717830 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg" (OuterVolumeSpecName: "kube-api-access-lx6dg") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "kube-api-access-lx6dg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:39:53.717963 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.717861 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:39:53.717963 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.717906 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "70e929d7-d1b5-4082-a78e-d3d434df3337" (UID: "70e929d7-d1b5-4082-a78e-d3d434df3337"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:39:53.816321 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816265 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-service-ca\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816321 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816315 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816321 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816329 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-console-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816562 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816342 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lx6dg\" (UniqueName: \"kubernetes.io/projected/70e929d7-d1b5-4082-a78e-d3d434df3337-kube-api-access-lx6dg\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816562 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816357 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-oauth-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816562 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816370 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70e929d7-d1b5-4082-a78e-d3d434df3337-console-oauth-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:53.816562 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:53.816382 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e929d7-d1b5-4082-a78e-d3d434df3337-trusted-ca-bundle\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:39:54.344194 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344166 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85b6d7fcc4-vw8q7_70e929d7-d1b5-4082-a78e-d3d434df3337/console/0.log" Apr 22 15:39:54.344375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344209 2573 generic.go:358] "Generic (PLEG): container finished" podID="70e929d7-d1b5-4082-a78e-d3d434df3337" containerID="e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25" exitCode=2 Apr 22 15:39:54.344375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b6d7fcc4-vw8q7" event={"ID":"70e929d7-d1b5-4082-a78e-d3d434df3337","Type":"ContainerDied","Data":"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25"} Apr 22 15:39:54.344375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344289 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b6d7fcc4-vw8q7" Apr 22 15:39:54.344375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b6d7fcc4-vw8q7" event={"ID":"70e929d7-d1b5-4082-a78e-d3d434df3337","Type":"ContainerDied","Data":"692102aa9a0376cd5af387819bd439178cb81e8b7d5724488c02bf41d596fa91"} Apr 22 15:39:54.344375 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.344322 2573 scope.go:117] "RemoveContainer" containerID="e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25" Apr 22 15:39:54.352634 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.352615 2573 scope.go:117] "RemoveContainer" containerID="e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25" Apr 22 15:39:54.352971 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:39:54.352948 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25\": container with ID starting with e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25 not found: ID does not exist" containerID="e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25" Apr 22 15:39:54.353025 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.352981 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25"} err="failed to get container status \"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25\": rpc error: code = NotFound desc = could not find container \"e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25\": container with ID starting with e522600bbd4bb9c119318eb5668bcde95d2d839697b07fbdbf8efa2551bd3c25 not found: ID does not exist" Apr 22 15:39:54.365783 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.365747 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:39:54.371173 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:54.371139 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85b6d7fcc4-vw8q7"] Apr 22 15:39:55.254442 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:39:55.254401 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e929d7-d1b5-4082-a78e-d3d434df3337" path="/var/lib/kubelet/pods/70e929d7-d1b5-4082-a78e-d3d434df3337/volumes" Apr 22 15:40:18.840038 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.839999 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6"] Apr 22 15:40:18.840596 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.840332 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70e929d7-d1b5-4082-a78e-d3d434df3337" containerName="console" Apr 22 15:40:18.840596 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.840350 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e929d7-d1b5-4082-a78e-d3d434df3337" containerName="console" Apr 22 15:40:18.840596 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.840429 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70e929d7-d1b5-4082-a78e-d3d434df3337" containerName="console" Apr 22 15:40:18.843445 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.843424 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:18.846928 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.846898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 15:40:18.847097 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.846900 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 15:40:18.847097 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.846949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 15:40:18.847097 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.846966 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vjpml\"" Apr 22 15:40:18.847097 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.846970 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 15:40:18.852825 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.852800 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6"] Apr 22 15:40:18.932287 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.932245 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz"] Apr 22 15:40:18.935804 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.935769 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:18.938592 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.938552 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 15:40:18.938766 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.938635 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 15:40:18.938766 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.938655 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 15:40:18.938766 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.938719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 15:40:18.948123 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:18.948077 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz"] Apr 22 15:40:19.006683 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.006644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwj65\" (UniqueName: \"kubernetes.io/projected/c2184c69-6253-4f82-9752-76f0a1dc206f-kube-api-access-vwj65\") pod \"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.006882 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.006705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2184c69-6253-4f82-9752-76f0a1dc206f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.108128 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxvm\" (UniqueName: \"kubernetes.io/projected/8046d64d-c651-4c4f-ab22-bc94d1802922-kube-api-access-zvxvm\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108128 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8046d64d-c651-4c4f-ab22-bc94d1802922-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108311 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108311 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwj65\" (UniqueName: 
\"kubernetes.io/projected/c2184c69-6253-4f82-9752-76f0a1dc206f-kube-api-access-vwj65\") pod \"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.108311 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108311 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108435 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.108435 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.108343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2184c69-6253-4f82-9752-76f0a1dc206f-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.111158 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.111132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2184c69-6253-4f82-9752-76f0a1dc206f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.117204 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.117169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwj65\" (UniqueName: \"kubernetes.io/projected/c2184c69-6253-4f82-9752-76f0a1dc206f-kube-api-access-vwj65\") pod \"managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6\" (UID: \"c2184c69-6253-4f82-9752-76f0a1dc206f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.164158 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.164121 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" Apr 22 15:40:19.209373 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.209325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.209563 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.209401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.209563 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.209439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.209682 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.209576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.209682 ip-10-0-143-128 kubenswrapper[2573]: I0422 
15:40:19.209641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxvm\" (UniqueName: \"kubernetes.io/projected/8046d64d-c651-4c4f-ab22-bc94d1802922-kube-api-access-zvxvm\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.209782 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.209703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8046d64d-c651-4c4f-ab22-bc94d1802922-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.210734 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.210701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8046d64d-c651-4c4f-ab22-bc94d1802922-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.212479 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.212399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-ca\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.212479 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.212430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub\") pod 
\"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.213979 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.213934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.215626 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.215580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8046d64d-c651-4c4f-ab22-bc94d1802922-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.219333 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.219299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxvm\" (UniqueName: \"kubernetes.io/projected/8046d64d-c651-4c4f-ab22-bc94d1802922-kube-api-access-zvxvm\") pod \"cluster-proxy-proxy-agent-6f458bb86-n9mgz\" (UID: \"8046d64d-c651-4c4f-ab22-bc94d1802922\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.253299 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.253262 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" Apr 22 15:40:19.295486 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.295025 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6"] Apr 22 15:40:19.384678 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.384603 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz"] Apr 22 15:40:19.388046 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:40:19.388020 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8046d64d_c651_4c4f_ab22_bc94d1802922.slice/crio-56b5531d799743eace81e1a73ab37a44c559b03d48fb0f21d79d8c165705485b WatchSource:0}: Error finding container 56b5531d799743eace81e1a73ab37a44c559b03d48fb0f21d79d8c165705485b: Status 404 returned error can't find the container with id 56b5531d799743eace81e1a73ab37a44c559b03d48fb0f21d79d8c165705485b Apr 22 15:40:19.413672 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.413622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" event={"ID":"8046d64d-c651-4c4f-ab22-bc94d1802922","Type":"ContainerStarted","Data":"56b5531d799743eace81e1a73ab37a44c559b03d48fb0f21d79d8c165705485b"} Apr 22 15:40:19.414681 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:19.414644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" event={"ID":"c2184c69-6253-4f82-9752-76f0a1dc206f","Type":"ContainerStarted","Data":"a7e34f8599a7e1e8cd961f058d257e4cb6aafef0899bd3056a6f46b9aef176ff"} Apr 22 15:40:23.428376 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:23.428339 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" event={"ID":"8046d64d-c651-4c4f-ab22-bc94d1802922","Type":"ContainerStarted","Data":"48bbea99939d15e2cc984f76ad38a3d0109c246461bf668b64feb2849765f84f"} Apr 22 15:40:23.429498 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:23.429476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" event={"ID":"c2184c69-6253-4f82-9752-76f0a1dc206f","Type":"ContainerStarted","Data":"ca223a459f88e7889d95ad22127fbfb9fc5d682acadd2a9df049d14fa8d23464"} Apr 22 15:40:23.444645 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:23.444593 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fbfb5cd85-cd2f6" podStartSLOduration=1.93010892 podStartE2EDuration="5.444576249s" podCreationTimestamp="2026-04-22 15:40:18 +0000 UTC" firstStartedPulling="2026-04-22 15:40:19.30095881 +0000 UTC m=+390.626712439" lastFinishedPulling="2026-04-22 15:40:22.815426136 +0000 UTC m=+394.141179768" observedRunningTime="2026-04-22 15:40:23.443582783 +0000 UTC m=+394.769336432" watchObservedRunningTime="2026-04-22 15:40:23.444576249 +0000 UTC m=+394.770329903" Apr 22 15:40:26.441175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:26.441129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" event={"ID":"8046d64d-c651-4c4f-ab22-bc94d1802922","Type":"ContainerStarted","Data":"0e4034e9dcd7b9214827f89d30a15007d704d245debecc75940f27161744ac50"} Apr 22 15:40:26.441175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:26.441173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" 
event={"ID":"8046d64d-c651-4c4f-ab22-bc94d1802922","Type":"ContainerStarted","Data":"f0bb63f096744a649b22b85a9c54821233aee57f18cbed1ab6248c316d144a0e"} Apr 22 15:40:26.459725 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:40:26.459662 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f458bb86-n9mgz" podStartSLOduration=2.099841107 podStartE2EDuration="8.459645593s" podCreationTimestamp="2026-04-22 15:40:18 +0000 UTC" firstStartedPulling="2026-04-22 15:40:19.389679741 +0000 UTC m=+390.715433369" lastFinishedPulling="2026-04-22 15:40:25.749484212 +0000 UTC m=+397.075237855" observedRunningTime="2026-04-22 15:40:26.458325109 +0000 UTC m=+397.784078757" watchObservedRunningTime="2026-04-22 15:40:26.459645593 +0000 UTC m=+397.785399292" Apr 22 15:43:20.506102 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.506038 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b4bb6977d-t29g2"] Apr 22 15:43:20.510028 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.510000 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.524797 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.524766 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4bb6977d-t29g2"] Apr 22 15:43:20.582079 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-oauth-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582292 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-service-ca\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582292 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582292 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582241 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-oauth-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582468 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582468 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582369 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrwt\" (UniqueName: \"kubernetes.io/projected/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-kube-api-access-tsrwt\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.582572 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.582468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-trusted-ca-bundle\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683042 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.682998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrwt\" (UniqueName: \"kubernetes.io/projected/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-kube-api-access-tsrwt\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-trusted-ca-bundle\") pod 
\"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-oauth-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683273 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-service-ca\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683414 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683414 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-oauth-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683512 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683437 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683943 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-service-ca\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.683943 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-oauth-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.684140 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.684140 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.683997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-trusted-ca-bundle\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.686044 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.686013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-serving-cert\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.686044 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.686036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-console-oauth-config\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.695634 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.695598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrwt\" (UniqueName: \"kubernetes.io/projected/5769d82c-40d1-49a8-8038-d6c1ee6a9ece-kube-api-access-tsrwt\") pod \"console-b4bb6977d-t29g2\" (UID: \"5769d82c-40d1-49a8-8038-d6c1ee6a9ece\") " pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.820613 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.820507 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:20.949353 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:20.949242 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4bb6977d-t29g2"] Apr 22 15:43:20.952003 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:43:20.951974 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5769d82c_40d1_49a8_8038_d6c1ee6a9ece.slice/crio-200729acc11444ccb9f2823899561867536bd0047a4ac67d8c5ccee19bd33aa7 WatchSource:0}: Error finding container 200729acc11444ccb9f2823899561867536bd0047a4ac67d8c5ccee19bd33aa7: Status 404 returned error can't find the container with id 200729acc11444ccb9f2823899561867536bd0047a4ac67d8c5ccee19bd33aa7 Apr 22 15:43:21.952997 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:21.952956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4bb6977d-t29g2" event={"ID":"5769d82c-40d1-49a8-8038-d6c1ee6a9ece","Type":"ContainerStarted","Data":"ee3c7787da5ac1e231d2297cbb410e5f77baa8633155fb3ef02c96673b43db7c"} Apr 22 15:43:21.952997 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:21.952996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4bb6977d-t29g2" event={"ID":"5769d82c-40d1-49a8-8038-d6c1ee6a9ece","Type":"ContainerStarted","Data":"200729acc11444ccb9f2823899561867536bd0047a4ac67d8c5ccee19bd33aa7"} Apr 22 15:43:21.973179 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:21.973125 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b4bb6977d-t29g2" podStartSLOduration=1.973109802 podStartE2EDuration="1.973109802s" podCreationTimestamp="2026-04-22 15:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:43:21.971548608 +0000 UTC m=+573.297302259" 
watchObservedRunningTime="2026-04-22 15:43:21.973109802 +0000 UTC m=+573.298863454" Apr 22 15:43:30.821421 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:30.821378 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:30.821421 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:30.821425 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:30.826175 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:30.826147 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:30.982870 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:30.982835 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b4bb6977d-t29g2" Apr 22 15:43:31.035573 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:31.035532 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:43:49.151488 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:49.151447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:43:49.151932 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:49.151695 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:43:49.154135 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:49.154108 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:43:49.154264 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:49.154201 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:43:56.057354 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.057313 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c48f4fc67-p5qwx" podUID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" containerName="console" containerID="cri-o://0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5" gracePeriod=15 Apr 22 15:43:56.298941 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.298915 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c48f4fc67-p5qwx_8fcd2ee2-b3e3-496d-873f-509fdf93ed27/console/0.log" Apr 22 15:43:56.299101 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.298982 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:43:56.361327 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361236 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361327 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361283 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361327 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7whmj\" (UniqueName: 
\"kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361604 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361341 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361604 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361364 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361604 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361389 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361604 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361441 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config\") pod \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\" (UID: \"8fcd2ee2-b3e3-496d-873f-509fdf93ed27\") " Apr 22 15:43:56.361901 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361872 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config" (OuterVolumeSpecName: "console-config") pod 
"8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:43:56.361961 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361896 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:43:56.361961 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361906 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:43:56.361961 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.361914 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca" (OuterVolumeSpecName: "service-ca") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:43:56.363690 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.363664 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:43:56.363690 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.363664 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj" (OuterVolumeSpecName: "kube-api-access-7whmj") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "kube-api-access-7whmj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:43:56.363826 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.363732 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8fcd2ee2-b3e3-496d-873f-509fdf93ed27" (UID: "8fcd2ee2-b3e3-496d-873f-509fdf93ed27"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:43:56.462141 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462103 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7whmj\" (UniqueName: \"kubernetes.io/projected/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-kube-api-access-7whmj\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462141 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462137 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-trusted-ca-bundle\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462141 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462147 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-oauth-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462374 
ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462156 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-service-ca\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462374 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462166 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-oauth-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462374 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462175 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-serving-cert\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:56.462374 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:56.462185 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fcd2ee2-b3e3-496d-873f-509fdf93ed27-console-config\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:43:57.051542 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c48f4fc67-p5qwx_8fcd2ee2-b3e3-496d-873f-509fdf93ed27/console/0.log" Apr 22 15:43:57.051755 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051553 2573 generic.go:358] "Generic (PLEG): container finished" podID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" containerID="0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5" exitCode=2 Apr 22 15:43:57.051755 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c48f4fc67-p5qwx" 
event={"ID":"8fcd2ee2-b3e3-496d-873f-509fdf93ed27","Type":"ContainerDied","Data":"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5"} Apr 22 15:43:57.051755 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c48f4fc67-p5qwx" event={"ID":"8fcd2ee2-b3e3-496d-873f-509fdf93ed27","Type":"ContainerDied","Data":"867ea4067c641441771fb5ea775da9e8ed73a20bc084e6db3a11a7ef376e5386"} Apr 22 15:43:57.051755 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051638 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c48f4fc67-p5qwx" Apr 22 15:43:57.051755 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.051648 2573 scope.go:117] "RemoveContainer" containerID="0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5" Apr 22 15:43:57.060662 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.060470 2573 scope.go:117] "RemoveContainer" containerID="0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5" Apr 22 15:43:57.060897 ip-10-0-143-128 kubenswrapper[2573]: E0422 15:43:57.060776 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5\": container with ID starting with 0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5 not found: ID does not exist" containerID="0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5" Apr 22 15:43:57.060897 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.060803 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5"} err="failed to get container status \"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5\": rpc error: code = NotFound desc = could not find container 
\"0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5\": container with ID starting with 0dfcc48de17724bafac2da176766258bf20077948d81b1470b1c305ee852bba5 not found: ID does not exist" Apr 22 15:43:57.110267 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.110202 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:43:57.112823 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.112791 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c48f4fc67-p5qwx"] Apr 22 15:43:57.254376 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:43:57.254340 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" path="/var/lib/kubelet/pods/8fcd2ee2-b3e3-496d-873f-509fdf93ed27/volumes" Apr 22 15:44:26.533655 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.533613 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk"] Apr 22 15:44:26.534208 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.534129 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" containerName="console" Apr 22 15:44:26.534208 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.534150 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" containerName="console" Apr 22 15:44:26.534335 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.534241 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fcd2ee2-b3e3-496d-873f-509fdf93ed27" containerName="console" Apr 22 15:44:26.537316 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.537289 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:44:26.541632 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.541575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-x9jlh\"/\"default-dockercfg-n8rsq\"" Apr 22 15:44:26.541632 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.541575 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x9jlh\"/\"kube-root-ca.crt\"" Apr 22 15:44:26.541941 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.541918 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x9jlh\"/\"openshift-service-ca.crt\"" Apr 22 15:44:26.551388 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.551352 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk"] Apr 22 15:44:26.597329 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.597282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7wj\" (UniqueName: \"kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj\") pod \"test-trainjob-gqfsf-node-0-0-l6rwk\" (UID: \"09915ea3-651f-44fc-aa8e-89e0f8db6097\") " pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:44:26.698117 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.698038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7wj\" (UniqueName: \"kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj\") pod \"test-trainjob-gqfsf-node-0-0-l6rwk\" (UID: \"09915ea3-651f-44fc-aa8e-89e0f8db6097\") " pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:44:26.707343 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.707316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7wj\" (UniqueName: 
\"kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj\") pod \"test-trainjob-gqfsf-node-0-0-l6rwk\" (UID: \"09915ea3-651f-44fc-aa8e-89e0f8db6097\") " pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:44:26.846453 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.846343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:44:26.979610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.979583 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk"] Apr 22 15:44:26.982508 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:44:26.982478 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09915ea3_651f_44fc_aa8e_89e0f8db6097.slice/crio-a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90 WatchSource:0}: Error finding container a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90: Status 404 returned error can't find the container with id a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90 Apr 22 15:44:26.984544 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:26.984521 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:44:27.136942 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:44:27.136848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" event={"ID":"09915ea3-651f-44fc-aa8e-89e0f8db6097","Type":"ContainerStarted","Data":"a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90"} Apr 22 15:49:20.570369 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:20.570342 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 
22 15:49:20.570870 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:20.570347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:49:20.573400 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:20.573379 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:49:20.573527 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:20.573413 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:49:22.084420 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:22.084384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" event={"ID":"09915ea3-651f-44fc-aa8e-89e0f8db6097","Type":"ContainerStarted","Data":"a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb"} Apr 22 15:49:22.109626 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:22.109568 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" podStartSLOduration=1.41341597 podStartE2EDuration="4m56.109552306s" podCreationTimestamp="2026-04-22 15:44:26 +0000 UTC" firstStartedPulling="2026-04-22 15:44:26.984689627 +0000 UTC m=+638.310443256" lastFinishedPulling="2026-04-22 15:49:21.680825948 +0000 UTC m=+933.006579592" observedRunningTime="2026-04-22 15:49:22.108391601 +0000 UTC m=+933.434145276" watchObservedRunningTime="2026-04-22 15:49:22.109552306 +0000 UTC m=+933.435305957" Apr 22 15:49:29.109428 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:29.109394 2573 generic.go:358] "Generic (PLEG): container finished" podID="09915ea3-651f-44fc-aa8e-89e0f8db6097" 
containerID="a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb" exitCode=0 Apr 22 15:49:29.109854 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:29.109465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" event={"ID":"09915ea3-651f-44fc-aa8e-89e0f8db6097","Type":"ContainerDied","Data":"a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb"} Apr 22 15:49:30.373289 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:30.373267 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:49:30.479774 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:30.479738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh7wj\" (UniqueName: \"kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj\") pod \"09915ea3-651f-44fc-aa8e-89e0f8db6097\" (UID: \"09915ea3-651f-44fc-aa8e-89e0f8db6097\") " Apr 22 15:49:30.482103 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:30.482072 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj" (OuterVolumeSpecName: "kube-api-access-wh7wj") pod "09915ea3-651f-44fc-aa8e-89e0f8db6097" (UID: "09915ea3-651f-44fc-aa8e-89e0f8db6097"). InnerVolumeSpecName "kube-api-access-wh7wj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:49:30.580428 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:30.580396 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wh7wj\" (UniqueName: \"kubernetes.io/projected/09915ea3-651f-44fc-aa8e-89e0f8db6097-kube-api-access-wh7wj\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:49:31.115800 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.115766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" event={"ID":"09915ea3-651f-44fc-aa8e-89e0f8db6097","Type":"ContainerDied","Data":"a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90"} Apr 22 15:49:31.115800 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.115798 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81534318548556445189449868667f9512734a01d2a3a2be49b65619c2ccf90" Apr 22 15:49:31.115800 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.115800 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk" Apr 22 15:49:31.765406 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.765367 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp"] Apr 22 15:49:31.765812 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.765646 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09915ea3-651f-44fc-aa8e-89e0f8db6097" containerName="node" Apr 22 15:49:31.765812 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.765656 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="09915ea3-651f-44fc-aa8e-89e0f8db6097" containerName="node" Apr 22 15:49:31.765812 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.765701 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="09915ea3-651f-44fc-aa8e-89e0f8db6097" containerName="node" Apr 22 15:49:31.985963 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.985924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp"] Apr 22 15:49:31.986157 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.986111 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:49:31.988977 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.988940 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xqxzt\"/\"openshift-service-ca.crt\"" Apr 22 15:49:31.989822 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.989794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xqxzt\"/\"kube-root-ca.crt\"" Apr 22 15:49:31.989939 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:31.989868 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-xqxzt\"/\"default-dockercfg-skbjf\"" Apr 22 15:49:32.089756 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.089665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbm88\" (UniqueName: \"kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88\") pod \"test-trainjob-9kkz7-node-0-0-z8fhp\" (UID: \"f2cd5b5b-becf-4cf2-9233-8d3715e048da\") " pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:49:32.190458 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.190409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbm88\" (UniqueName: \"kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88\") pod \"test-trainjob-9kkz7-node-0-0-z8fhp\" (UID: \"f2cd5b5b-becf-4cf2-9233-8d3715e048da\") " pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:49:32.199408 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.199373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbm88\" (UniqueName: \"kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88\") pod \"test-trainjob-9kkz7-node-0-0-z8fhp\" (UID: \"f2cd5b5b-becf-4cf2-9233-8d3715e048da\") " 
pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:49:32.295484 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.295431 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:49:32.419994 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.419840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp"] Apr 22 15:49:32.422873 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:49:32.422842 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2cd5b5b_becf_4cf2_9233_8d3715e048da.slice/crio-8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56 WatchSource:0}: Error finding container 8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56: Status 404 returned error can't find the container with id 8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56 Apr 22 15:49:32.426463 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:32.426437 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:49:33.122237 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:49:33.122203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" event={"ID":"f2cd5b5b-becf-4cf2-9233-8d3715e048da","Type":"ContainerStarted","Data":"8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56"} Apr 22 15:53:52.937603 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:53:52.937569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" event={"ID":"f2cd5b5b-becf-4cf2-9233-8d3715e048da","Type":"ContainerStarted","Data":"e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c"} Apr 22 15:53:52.966070 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:53:52.966003 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" podStartSLOduration=1.979071566 podStartE2EDuration="4m21.965982741s" podCreationTimestamp="2026-04-22 15:49:31 +0000 UTC" firstStartedPulling="2026-04-22 15:49:32.426618903 +0000 UTC m=+943.752372540" lastFinishedPulling="2026-04-22 15:53:52.41353008 +0000 UTC m=+1203.739283715" observedRunningTime="2026-04-22 15:53:52.964227544 +0000 UTC m=+1204.289981197" watchObservedRunningTime="2026-04-22 15:53:52.965982741 +0000 UTC m=+1204.291736393" Apr 22 15:53:59.958994 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:53:59.958949 2573 generic.go:358] "Generic (PLEG): container finished" podID="f2cd5b5b-becf-4cf2-9233-8d3715e048da" containerID="e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c" exitCode=0 Apr 22 15:53:59.959435 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:53:59.959026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" event={"ID":"f2cd5b5b-becf-4cf2-9233-8d3715e048da","Type":"ContainerDied","Data":"e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c"} Apr 22 15:54:01.188358 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.188335 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:54:01.288750 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.288714 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbm88\" (UniqueName: \"kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88\") pod \"f2cd5b5b-becf-4cf2-9233-8d3715e048da\" (UID: \"f2cd5b5b-becf-4cf2-9233-8d3715e048da\") " Apr 22 15:54:01.291104 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.291078 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88" (OuterVolumeSpecName: "kube-api-access-cbm88") pod "f2cd5b5b-becf-4cf2-9233-8d3715e048da" (UID: "f2cd5b5b-becf-4cf2-9233-8d3715e048da"). InnerVolumeSpecName "kube-api-access-cbm88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:54:01.389327 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.389287 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbm88\" (UniqueName: \"kubernetes.io/projected/f2cd5b5b-becf-4cf2-9233-8d3715e048da-kube-api-access-cbm88\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:54:01.965715 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.965684 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" Apr 22 15:54:01.965715 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.965704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp" event={"ID":"f2cd5b5b-becf-4cf2-9233-8d3715e048da","Type":"ContainerDied","Data":"8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56"} Apr 22 15:54:01.965921 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:01.965735 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9167a4ad05ac201278960ca3b5d33ee4835c075b9123f948d9b14f8ffc8c56" Apr 22 15:54:02.917005 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:02.916970 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs"] Apr 22 15:54:02.917395 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:02.917254 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2cd5b5b-becf-4cf2-9233-8d3715e048da" containerName="node" Apr 22 15:54:02.917395 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:02.917266 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cd5b5b-becf-4cf2-9233-8d3715e048da" containerName="node" Apr 22 15:54:02.917395 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:02.917340 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2cd5b5b-becf-4cf2-9233-8d3715e048da" containerName="node" Apr 22 15:54:03.002610 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.002565 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs"] Apr 22 15:54:03.002753 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.002681 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:54:03.009107 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.009048 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-lrx8p\"/\"default-dockercfg-hh5jz\"" Apr 22 15:54:03.009107 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.009092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-lrx8p\"/\"kube-root-ca.crt\"" Apr 22 15:54:03.009107 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.009049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-lrx8p\"/\"openshift-service-ca.crt\"" Apr 22 15:54:03.105514 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.105474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbj8\" (UniqueName: \"kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8\") pod \"test-trainjob-7vp9x-node-0-0-cxtqs\" (UID: \"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc\") " pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:54:03.206677 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.206582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbj8\" (UniqueName: \"kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8\") pod \"test-trainjob-7vp9x-node-0-0-cxtqs\" (UID: \"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc\") " pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:54:03.215836 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.215801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbj8\" (UniqueName: \"kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8\") pod \"test-trainjob-7vp9x-node-0-0-cxtqs\" (UID: \"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc\") " 
pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:54:03.316863 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.316825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:54:03.446815 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.446719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs"] Apr 22 15:54:03.449212 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:54:03.449182 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5783c4_71be_49c2_bc65_fb9fb7c1dcbc.slice/crio-05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e WatchSource:0}: Error finding container 05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e: Status 404 returned error can't find the container with id 05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e Apr 22 15:54:03.973586 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:03.973549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" event={"ID":"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc","Type":"ContainerStarted","Data":"05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e"} Apr 22 15:54:34.445475 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:34.445446 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:54:34.458590 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:34.445514 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:54:34.458590 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:34.448489 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:54:34.458590 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:54:34.448542 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:55:21.263503 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:21.263465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" event={"ID":"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc","Type":"ContainerStarted","Data":"a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26"} Apr 22 15:55:21.284362 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:21.284308 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" podStartSLOduration=2.266044331 podStartE2EDuration="1m19.284288918s" podCreationTimestamp="2026-04-22 15:54:02 +0000 UTC" firstStartedPulling="2026-04-22 15:54:03.451144171 +0000 UTC m=+1214.776897801" lastFinishedPulling="2026-04-22 15:55:20.469388759 +0000 UTC m=+1291.795142388" observedRunningTime="2026-04-22 15:55:21.281852848 +0000 UTC m=+1292.607606500" watchObservedRunningTime="2026-04-22 15:55:21.284288918 +0000 UTC m=+1292.610042568" Apr 22 15:55:24.274501 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:24.274456 2573 generic.go:358] "Generic (PLEG): container finished" podID="fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" containerID="a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26" exitCode=0 Apr 22 15:55:24.274976 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:24.274539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" 
event={"ID":"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc","Type":"ContainerDied","Data":"a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26"} Apr 22 15:55:25.405540 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:25.405512 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:55:25.546397 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:25.546307 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghbj8\" (UniqueName: \"kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8\") pod \"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc\" (UID: \"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc\") " Apr 22 15:55:25.548761 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:25.548721 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8" (OuterVolumeSpecName: "kube-api-access-ghbj8") pod "fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" (UID: "fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc"). InnerVolumeSpecName "kube-api-access-ghbj8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:55:25.647455 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:25.647418 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghbj8\" (UniqueName: \"kubernetes.io/projected/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc-kube-api-access-ghbj8\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 15:55:26.282996 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:26.282955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" event={"ID":"fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc","Type":"ContainerDied","Data":"05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e"} Apr 22 15:55:26.282996 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:26.282993 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b11dd52eef6e70fdfc616eea7ff4746c50c770347859f5337b8c96eb432d8e" Apr 22 15:55:26.282996 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:26.282992 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs" Apr 22 15:55:27.247813 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.247773 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm"] Apr 22 15:55:27.248253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.248184 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" containerName="node" Apr 22 15:55:27.248253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.248197 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" containerName="node" Apr 22 15:55:27.248253 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.248248 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" containerName="node" Apr 22 15:55:27.308974 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.308924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm"] Apr 22 15:55:27.309177 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.308995 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 15:55:27.311636 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.311616 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-t5wjv\"/\"openshift-service-ca.crt\"" Apr 22 15:55:27.312658 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.312635 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-t5wjv\"/\"kube-root-ca.crt\"" Apr 22 15:55:27.312778 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.312691 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-t5wjv\"/\"default-dockercfg-bl8gz\"" Apr 22 15:55:27.460455 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.460416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnqj\" (UniqueName: \"kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj\") pod \"test-trainjob-p2c2q-node-0-0-9k9dm\" (UID: \"df28af46-b005-4a5d-a3fe-f7e2e4a0d000\") " pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 15:55:27.561525 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.561429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnqj\" (UniqueName: \"kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj\") pod \"test-trainjob-p2c2q-node-0-0-9k9dm\" (UID: \"df28af46-b005-4a5d-a3fe-f7e2e4a0d000\") " pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 15:55:27.570283 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.570244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnqj\" (UniqueName: \"kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj\") pod \"test-trainjob-p2c2q-node-0-0-9k9dm\" (UID: \"df28af46-b005-4a5d-a3fe-f7e2e4a0d000\") " 
pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 15:55:27.619734 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.619690 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 15:55:27.747402 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.747366 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm"] Apr 22 15:55:27.750713 ip-10-0-143-128 kubenswrapper[2573]: W0422 15:55:27.750675 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf28af46_b005_4a5d_a3fe_f7e2e4a0d000.slice/crio-9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83 WatchSource:0}: Error finding container 9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83: Status 404 returned error can't find the container with id 9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83 Apr 22 15:55:27.753000 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:27.752984 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:55:28.291054 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:55:28.291021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" event={"ID":"df28af46-b005-4a5d-a3fe-f7e2e4a0d000","Type":"ContainerStarted","Data":"9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83"} Apr 22 15:59:34.467704 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:59:34.467578 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:59:34.470818 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:59:34.470795 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 15:59:34.473006 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:59:34.472981 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 15:59:34.474397 ip-10-0-143-128 kubenswrapper[2573]: I0422 15:59:34.474372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:02:54.835433 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:54.835388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" event={"ID":"df28af46-b005-4a5d-a3fe-f7e2e4a0d000","Type":"ContainerStarted","Data":"9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645"} Apr 22 16:02:54.838492 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:54.838462 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-t5wjv\"/\"default-dockercfg-bl8gz\"" Apr 22 16:02:54.868890 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:54.868827 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" podStartSLOduration=1.895993984 podStartE2EDuration="7m27.868809516s" podCreationTimestamp="2026-04-22 15:55:27 +0000 UTC" firstStartedPulling="2026-04-22 15:55:27.753135587 +0000 UTC m=+1299.078889217" lastFinishedPulling="2026-04-22 16:02:53.725951121 +0000 UTC m=+1745.051704749" observedRunningTime="2026-04-22 16:02:54.866173903 +0000 UTC m=+1746.191927555" watchObservedRunningTime="2026-04-22 16:02:54.868809516 +0000 UTC m=+1746.194563228" Apr 22 16:02:54.946819 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:54.946791 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"test-ns-t5wjv\"/\"kube-root-ca.crt\"" Apr 22 16:02:54.957795 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:54.957766 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-t5wjv\"/\"openshift-service-ca.crt\"" Apr 22 16:02:57.845954 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:57.845866 2573 generic.go:358] "Generic (PLEG): container finished" podID="df28af46-b005-4a5d-a3fe-f7e2e4a0d000" containerID="9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645" exitCode=0 Apr 22 16:02:57.845954 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:57.845905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" event={"ID":"df28af46-b005-4a5d-a3fe-f7e2e4a0d000","Type":"ContainerDied","Data":"9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645"} Apr 22 16:02:58.983299 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:58.983271 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 16:02:59.079405 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.079365 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmnqj\" (UniqueName: \"kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj\") pod \"df28af46-b005-4a5d-a3fe-f7e2e4a0d000\" (UID: \"df28af46-b005-4a5d-a3fe-f7e2e4a0d000\") " Apr 22 16:02:59.081730 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.081702 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj" (OuterVolumeSpecName: "kube-api-access-gmnqj") pod "df28af46-b005-4a5d-a3fe-f7e2e4a0d000" (UID: "df28af46-b005-4a5d-a3fe-f7e2e4a0d000"). InnerVolumeSpecName "kube-api-access-gmnqj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:02:59.180545 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.180460 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmnqj\" (UniqueName: \"kubernetes.io/projected/df28af46-b005-4a5d-a3fe-f7e2e4a0d000-kube-api-access-gmnqj\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 16:02:59.853527 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.853495 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" Apr 22 16:02:59.853778 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.853495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm" event={"ID":"df28af46-b005-4a5d-a3fe-f7e2e4a0d000","Type":"ContainerDied","Data":"9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83"} Apr 22 16:02:59.853778 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:02:59.853611 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a92d9d3f0fa7fdafcc9aed3e5a22ef7193b2a4f36166201e66e9d97f32d9b83" Apr 22 16:03:00.191969 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.191881 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc"] Apr 22 16:03:00.192355 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.192234 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df28af46-b005-4a5d-a3fe-f7e2e4a0d000" containerName="node" Apr 22 16:03:00.192355 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.192246 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="df28af46-b005-4a5d-a3fe-f7e2e4a0d000" containerName="node" Apr 22 16:03:00.192355 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.192299 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="df28af46-b005-4a5d-a3fe-f7e2e4a0d000" 
containerName="node" Apr 22 16:03:00.218187 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.218155 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc"] Apr 22 16:03:00.218400 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.218270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:03:00.220953 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.220929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"kube-root-ca.crt\"" Apr 22 16:03:00.220953 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.220947 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vc978\"/\"default-dockercfg-5hndh\"" Apr 22 16:03:00.221156 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.220931 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"openshift-service-ca.crt\"" Apr 22 16:03:00.289606 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.289569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fk48\" (UniqueName: \"kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48\") pod \"test-trainjob-lflgl-node-0-0-tlbhc\" (UID: \"c1fcf201-3663-45ae-8f76-74cdf9e238c1\") " pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:03:00.390173 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.390138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fk48\" (UniqueName: \"kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48\") pod \"test-trainjob-lflgl-node-0-0-tlbhc\" (UID: \"c1fcf201-3663-45ae-8f76-74cdf9e238c1\") " pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:03:00.398852 ip-10-0-143-128 
kubenswrapper[2573]: I0422 16:03:00.398820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fk48\" (UniqueName: \"kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48\") pod \"test-trainjob-lflgl-node-0-0-tlbhc\" (UID: \"c1fcf201-3663-45ae-8f76-74cdf9e238c1\") " pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:03:00.527564 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.527523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:03:00.656876 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.656847 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc"] Apr 22 16:03:00.659704 ip-10-0-143-128 kubenswrapper[2573]: W0422 16:03:00.659672 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1fcf201_3663_45ae_8f76_74cdf9e238c1.slice/crio-0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d WatchSource:0}: Error finding container 0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d: Status 404 returned error can't find the container with id 0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d Apr 22 16:03:00.661564 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.661549 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:03:00.858245 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:03:00.858153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" event={"ID":"c1fcf201-3663-45ae-8f76-74cdf9e238c1","Type":"ContainerStarted","Data":"0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d"} Apr 22 16:04:34.492927 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:04:34.492893 2573 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:04:34.495726 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:04:34.495160 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:04:34.496162 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:04:34.496130 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:04:34.498359 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:04:34.498324 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:09:34.516804 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:09:34.516683 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:09:34.520827 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:09:34.519969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:09:34.521020 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:09:34.521002 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:09:34.524131 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:09:34.524107 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:11:04.968760 ip-10-0-143-128 
kubenswrapper[2573]: I0422 16:11:04.968728 2573 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 22 16:11:04.969261 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:11:04.968800 2573 container_gc.go:86] "Attempting to delete unused containers" Apr 22 16:11:04.970135 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:11:04.970109 2573 scope.go:117] "RemoveContainer" containerID="a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26" Apr 22 16:11:05.338196 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:11:05.338163 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasDiskPressure" Apr 22 16:12:22.630967 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:12:22.630911 2573 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 22 16:12:22.630967 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:12:22.630975 2573 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 22 16:12:22.631640 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:12:22.630988 2573 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 22 16:13:04.971087 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:13:04.971028 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26" Apr 22 16:13:04.971720 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:13:04.971107 2573 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" 
containerID="a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26" Apr 22 16:13:04.971720 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:13:04.971131 2573 scope.go:117] "RemoveContainer" containerID="e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c" Apr 22 16:14:52.631670 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:14:52.631605 2573 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 22 16:14:52.631670 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:14:52.631672 2573 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 22 16:14:52.631670 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:14:52.631684 2573 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 22 16:15:04.972380 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:04.972324 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c" Apr 22 16:15:04.972380 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:04.972376 2573 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c" Apr 22 16:15:04.972989 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:04.972403 2573 scope.go:117] "RemoveContainer" containerID="9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645" Apr 22 16:15:06.631446 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.631410 2573 scope.go:117] "RemoveContainer" containerID="a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb" Apr 22 16:15:06.645747 ip-10-0-143-128 kubenswrapper[2573]: I0422 
16:15:06.645712 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:15:06.645936 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.645827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:15:06.648853 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.648828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:15:06.649075 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.649040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:15:06.656033 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.656003 2573 image_gc_manager.go:391] "Disk usage on image filesystem is over the high threshold, trying to free bytes down to the low threshold" usage=100 highThreshold=85 amountToFree=25648794009 lowThreshold=80 Apr 22 16:15:06.656033 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.656033 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 22 16:15:06.656694 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:06.656662 2573 log.go:32] "RemoveImage from image service failed" err="rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 22 16:15:06.656797 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:06.656708 2573 kuberuntime_image.go:137] 
"Failed to remove image" err="rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 22 16:15:06.656797 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:06.656728 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler="" Apr 22 16:15:07.188671 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:07.188630 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="7e65b8288e37c3f4fac04e8bf51240765caae34795b317d44d5399762a08b761" size=23201654702 runtimeHandler="" Apr 22 16:15:07.353054 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:07.352997 2573 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocfft/rocfft_kernel_cache.db: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6" Apr 22 16:15:07.353247 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:07.353212 2573 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed 
successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-lflgl-node-0-0.test-trainjob-lflgl,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fk48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-lflgl-node-0-0-tlbhc_test-ns-vc978(c1fcf201-3663-45ae-8f76-74cdf9e238c1): ErrImagePull: unable to pull image or OCI artifact: pull image 
err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocfft/rocfft_kernel_cache.db: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError" Apr 22 16:15:07.354420 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:07.354384 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocfft/rocfft_kernel_cache.db: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" Apr 22 16:15:07.388040 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:07.387968 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vc978\"/\"default-dockercfg-5hndh\"" Apr 22 16:15:07.436754 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:07.436724 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"kube-root-ca.crt\"" Apr 22 16:15:07.446912 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:07.446877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"openshift-service-ca.crt\"" Apr 22 16:15:10.833192 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:10.833150 2573 image_gc_manager.go:514] "Removing image to 
free bytes" imageID="7faf0aa0a3880746117325af4230e7cd818990a9deaa54e1b8356ee7257c9b43" size=7588072890 runtimeHandler="" Apr 22 16:15:10.836812 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:10.836783 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:15:10.837117 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:10.837080 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocfft/rocfft_kernel_cache.db: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" Apr 22 16:15:14.024579 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:14.024534 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="wanted to free 25648794009 bytes, but freed 31854734012 bytes space with errors in image deletion: rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" Apr 22 16:15:14.059588 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:14.059567 2573 image_gc_manager.go:447] "Attempting to delete unused images" Apr 22 16:15:14.070038 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:14.070011 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:15:14.072829 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:14.072802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:15:14.077009 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:14.076981 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 22 16:15:14.077502 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:14.077478 2573 log.go:32] "RemoveImage from image service failed" err="rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 22 16:15:14.077602 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:15:14.077511 2573 kuberuntime_image.go:137] "Failed to remove image" err="rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" image="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" Apr 22 
16:15:14.077602 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:14.077528 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler="" Apr 22 16:15:17.051819 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:17.051769 2573 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 22 16:15:21.699415 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:21.699367 2573 eviction_manager.go:473] "Eviction manager: unexpected error when attempting to reduce resource pressure" resourceName="ephemeral-storage" err="wanted to free 9223372036854775807 bytes, but freed 39680330324 bytes space with errors in image deletion: rpc error: code = Unknown desc = delete image: image used by 30dd8778ef59bfb75dfd8b79034230f68b6aced28a2a266c2d778b8770dde3ea: image is in use by a container" Apr 22 16:15:21.706445 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:15:21.706417 2573 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." 
resourceName="ephemeral-storage" Apr 22 16:16:21.962762 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:16:21.962724 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:20:06.665610 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:20:06.665525 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:20:06.668588 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:20:06.668567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:20:14.035198 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:20:14.035166 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:20:14.037907 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:20:14.037883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:20:14.043116 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:20:14.043094 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 16:24:52.679420 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:24:52.679364 2573 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 22 16:24:52.679420 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:24:52.679422 2573 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 22 16:24:52.680017 ip-10-0-143-128 kubenswrapper[2573]: I0422 
16:24:52.679437 2573 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 22 16:26:43.795442 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:43.795410 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:26:43.859606 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:43.795722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log" Apr 22 16:26:43.859606 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:43.798027 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:26:43.859606 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:43.798524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log" Apr 22 16:26:45.498440 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:45.498401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" event={"ID":"c1fcf201-3663-45ae-8f76-74cdf9e238c1","Type":"ContainerStarted","Data":"f650e286c5e76dc5a568bd79999bb8c965a8690f0222133e87cdfd51a32bb1aa"} Apr 22 16:26:45.502606 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:45.502578 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vc978\"/\"default-dockercfg-5hndh\"" Apr 22 16:26:45.534330 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:45.534265 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" 
podStartSLOduration=1.603501012 podStartE2EDuration="23m45.534251763s" podCreationTimestamp="2026-04-22 16:03:00 +0000 UTC" firstStartedPulling="2026-04-22 16:03:00.661676383 +0000 UTC m=+1751.987430013" lastFinishedPulling="2026-04-22 16:26:44.592427121 +0000 UTC m=+3175.918180764" observedRunningTime="2026-04-22 16:26:45.533964207 +0000 UTC m=+3176.859717851" watchObservedRunningTime="2026-04-22 16:26:45.534251763 +0000 UTC m=+3176.860005413" Apr 22 16:26:45.558772 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:45.558738 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"kube-root-ca.crt\"" Apr 22 16:26:45.568727 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:26:45.568703 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vc978\"/\"openshift-service-ca.crt\"" Apr 22 16:27:02.546336 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:02.546280 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" containerID="f650e286c5e76dc5a568bd79999bb8c965a8690f0222133e87cdfd51a32bb1aa" exitCode=0 Apr 22 16:27:02.546781 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:02.546358 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" event={"ID":"c1fcf201-3663-45ae-8f76-74cdf9e238c1","Type":"ContainerDied","Data":"f650e286c5e76dc5a568bd79999bb8c965a8690f0222133e87cdfd51a32bb1aa"} Apr 22 16:27:03.679972 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:03.679941 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:27:03.806958 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:03.806863 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fk48\" (UniqueName: \"kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48\") pod \"c1fcf201-3663-45ae-8f76-74cdf9e238c1\" (UID: \"c1fcf201-3663-45ae-8f76-74cdf9e238c1\") " Apr 22 16:27:03.809330 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:03.809301 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48" (OuterVolumeSpecName: "kube-api-access-4fk48") pod "c1fcf201-3663-45ae-8f76-74cdf9e238c1" (UID: "c1fcf201-3663-45ae-8f76-74cdf9e238c1"). InnerVolumeSpecName "kube-api-access-4fk48". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:27:03.907819 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:03.907777 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4fk48\" (UniqueName: \"kubernetes.io/projected/c1fcf201-3663-45ae-8f76-74cdf9e238c1-kube-api-access-4fk48\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 16:27:04.553490 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:04.553451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" event={"ID":"c1fcf201-3663-45ae-8f76-74cdf9e238c1","Type":"ContainerDied","Data":"0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d"} Apr 22 16:27:04.553490 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:04.553471 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc" Apr 22 16:27:04.553490 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:04.553488 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2f55feaa5beaddad0faa386d5afe853ac12e31fea433a151b0f6fabf2c2e8d" Apr 22 16:27:04.868669 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:04.868586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-vc978_test-trainjob-lflgl-node-0-0-tlbhc_c1fcf201-3663-45ae-8f76-74cdf9e238c1/node/0.log" Apr 22 16:27:04.949505 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:27:04.949471 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645\": container with ID starting with 9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645 not found: ID does not exist" containerID="9e3fe9da3280bfdd222d7fe9788bf2aa95f2b2f75d5266c85a3423abb41b9645" Apr 22 16:27:05.048770 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:27:05.048734 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26\": container with ID starting with a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26 not found: ID does not exist" containerID="a09e7dbfb8c4da983ca2b945d27f152bcc6364921b200f5d04ffe8cc45e75b26" Apr 22 16:27:05.148665 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:27:05.148582 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c\": container with ID starting with e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c not found: ID does not exist" 
containerID="e7087f3ca43559a5b1ae393715c688a0b13d69d090feeba1a6a426e6e9096f9c" Apr 22 16:27:05.643417 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:27:05.643380 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb\": container with ID starting with a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb not found: ID does not exist" containerID="a955ea75c669f1856e821fd43b54e5cfdde161a8c98ba13985eb9f76aac277fb" Apr 22 16:27:06.795858 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.795817 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wbrmb/must-gather-qr5cn"] Apr 22 16:27:06.796308 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.796136 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" containerName="node" Apr 22 16:27:06.796308 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.796147 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" containerName="node" Apr 22 16:27:06.796308 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.796196 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" containerName="node" Apr 22 16:27:06.937865 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.937828 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wbrmb/must-gather-qr5cn"] Apr 22 16:27:06.938036 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.937944 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:06.940664 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.940637 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wbrmb\"/\"openshift-service-ca.crt\"" Apr 22 16:27:06.940781 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.940721 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wbrmb\"/\"default-dockercfg-mqzth\"" Apr 22 16:27:06.940781 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:06.940749 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wbrmb\"/\"kube-root-ca.crt\"" Apr 22 16:27:07.034442 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.034401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.034650 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.034484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw7vt\" (UniqueName: \"kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.135342 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.135255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw7vt\" (UniqueName: \"kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " 
pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.135342 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.135307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.135659 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.135642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.143699 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.143673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw7vt\" (UniqueName: \"kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt\") pod \"must-gather-qr5cn\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.247914 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.247869 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:27:07.379705 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.379661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wbrmb/must-gather-qr5cn"] Apr 22 16:27:07.382784 ip-10-0-143-128 kubenswrapper[2573]: W0422 16:27:07.382754 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e239310_630a_43d3_b840_c82a1f7ef3ff.slice/crio-ca25bc18a9c6a511bf6e96152e4a651396573b9f65d5b7daa18eb64058c72db6 WatchSource:0}: Error finding container ca25bc18a9c6a511bf6e96152e4a651396573b9f65d5b7daa18eb64058c72db6: Status 404 returned error can't find the container with id ca25bc18a9c6a511bf6e96152e4a651396573b9f65d5b7daa18eb64058c72db6 Apr 22 16:27:07.384557 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.384531 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:27:07.562860 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:07.562820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" event={"ID":"1e239310-630a-43d3-b840-c82a1f7ef3ff","Type":"ContainerStarted","Data":"ca25bc18a9c6a511bf6e96152e4a651396573b9f65d5b7daa18eb64058c72db6"} Apr 22 16:27:09.908760 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:09.908718 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc"] Apr 22 16:27:09.911988 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:09.911953 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-vc978/test-trainjob-lflgl-node-0-0-tlbhc"] Apr 22 16:27:10.010697 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.010655 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm"] Apr 22 16:27:10.015892 ip-10-0-143-128 kubenswrapper[2573]: I0422 
16:27:10.015855 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-t5wjv/test-trainjob-p2c2q-node-0-0-9k9dm"] Apr 22 16:27:10.114985 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.114930 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs"] Apr 22 16:27:10.114985 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.114994 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-lrx8p/test-trainjob-7vp9x-node-0-0-cxtqs"] Apr 22 16:27:10.364074 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.364017 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp"] Apr 22 16:27:10.367552 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.367521 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-xqxzt/test-trainjob-9kkz7-node-0-0-z8fhp"] Apr 22 16:27:10.862968 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.862928 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk"] Apr 22 16:27:10.866212 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:10.866180 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-x9jlh/test-trainjob-gqfsf-node-0-0-l6rwk"] Apr 22 16:27:11.255884 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:11.255848 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09915ea3-651f-44fc-aa8e-89e0f8db6097" path="/var/lib/kubelet/pods/09915ea3-651f-44fc-aa8e-89e0f8db6097/volumes" Apr 22 16:27:11.256322 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:11.256250 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1fcf201-3663-45ae-8f76-74cdf9e238c1" path="/var/lib/kubelet/pods/c1fcf201-3663-45ae-8f76-74cdf9e238c1/volumes" Apr 22 16:27:11.256547 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:11.256533 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="df28af46-b005-4a5d-a3fe-f7e2e4a0d000" path="/var/lib/kubelet/pods/df28af46-b005-4a5d-a3fe-f7e2e4a0d000/volumes" Apr 22 16:27:11.256869 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:11.256853 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cd5b5b-becf-4cf2-9233-8d3715e048da" path="/var/lib/kubelet/pods/f2cd5b5b-becf-4cf2-9233-8d3715e048da/volumes" Apr 22 16:27:11.257310 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:11.257289 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc" path="/var/lib/kubelet/pods/fc5783c4-71be-49c2-bc65-fb9fb7c1dcbc/volumes" Apr 22 16:27:14.590338 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:14.590288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" event={"ID":"1e239310-630a-43d3-b840-c82a1f7ef3ff","Type":"ContainerStarted","Data":"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff"} Apr 22 16:27:15.597691 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:15.597654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" event={"ID":"1e239310-630a-43d3-b840-c82a1f7ef3ff","Type":"ContainerStarted","Data":"74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a"} Apr 22 16:27:15.614469 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:27:15.614414 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" podStartSLOduration=2.704523371 podStartE2EDuration="9.614398098s" podCreationTimestamp="2026-04-22 16:27:06 +0000 UTC" firstStartedPulling="2026-04-22 16:27:07.384729383 +0000 UTC m=+3198.710483019" lastFinishedPulling="2026-04-22 16:27:14.294604101 +0000 UTC m=+3205.620357746" observedRunningTime="2026-04-22 16:27:15.613082495 +0000 UTC m=+3206.938836145" watchObservedRunningTime="2026-04-22 16:27:15.614398098 +0000 UTC m=+3206.940151749" Apr 22 
16:28:07.780259 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:07.780220 2573 generic.go:358] "Generic (PLEG): container finished" podID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerID="66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff" exitCode=0 Apr 22 16:28:07.780735 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:07.780298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" event={"ID":"1e239310-630a-43d3-b840-c82a1f7ef3ff","Type":"ContainerDied","Data":"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff"} Apr 22 16:28:07.780735 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:07.780654 2573 scope.go:117] "RemoveContainer" containerID="66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff" Apr 22 16:28:08.013953 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:08.013920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wbrmb_must-gather-qr5cn_1e239310-630a-43d3-b840-c82a1f7ef3ff/gather/0.log" Apr 22 16:28:11.221783 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:11.221756 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hgz25_d6e0d117-87ac-43fe-bf80-ea2add6000f1/global-pull-secret-syncer/0.log" Apr 22 16:28:11.377533 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:11.377502 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m2d4r_538e0c6d-a79c-4576-9d34-fc920e2c9aef/konnectivity-agent/0.log" Apr 22 16:28:11.438911 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:11.438878 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-128.ec2.internal_bc217a568b1f424e4a361f02b88acbe4/haproxy/0.log" Apr 22 16:28:13.348407 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.348346 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wbrmb/must-gather-qr5cn"] 
Apr 22 16:28:13.349014 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.348660 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="copy" containerID="cri-o://74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a" gracePeriod=2 Apr 22 16:28:13.350245 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.350218 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wbrmb/must-gather-qr5cn"] Apr 22 16:28:13.350759 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.350735 2573 status_manager.go:895] "Failed to get status for pod" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" err="pods \"must-gather-qr5cn\" is forbidden: User \"system:node:ip-10-0-143-128.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wbrmb\": no relationship found between node 'ip-10-0-143-128.ec2.internal' and this object" Apr 22 16:28:13.582558 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.582530 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wbrmb_must-gather-qr5cn_1e239310-630a-43d3-b840-c82a1f7ef3ff/copy/0.log" Apr 22 16:28:13.582881 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.582866 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:28:13.585126 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.585096 2573 status_manager.go:895] "Failed to get status for pod" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" err="pods \"must-gather-qr5cn\" is forbidden: User \"system:node:ip-10-0-143-128.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wbrmb\": no relationship found between node 'ip-10-0-143-128.ec2.internal' and this object" Apr 22 16:28:13.710023 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.709911 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw7vt\" (UniqueName: \"kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt\") pod \"1e239310-630a-43d3-b840-c82a1f7ef3ff\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " Apr 22 16:28:13.710023 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.709967 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output\") pod \"1e239310-630a-43d3-b840-c82a1f7ef3ff\" (UID: \"1e239310-630a-43d3-b840-c82a1f7ef3ff\") " Apr 22 16:28:13.712368 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.712328 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1e239310-630a-43d3-b840-c82a1f7ef3ff" (UID: "1e239310-630a-43d3-b840-c82a1f7ef3ff"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:28:13.712515 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.712456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt" (OuterVolumeSpecName: "kube-api-access-bw7vt") pod "1e239310-630a-43d3-b840-c82a1f7ef3ff" (UID: "1e239310-630a-43d3-b840-c82a1f7ef3ff"). InnerVolumeSpecName "kube-api-access-bw7vt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:28:13.797030 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.797001 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wbrmb_must-gather-qr5cn_1e239310-630a-43d3-b840-c82a1f7ef3ff/copy/0.log" Apr 22 16:28:13.797441 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.797414 2573 generic.go:358] "Generic (PLEG): container finished" podID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerID="74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a" exitCode=143 Apr 22 16:28:13.797493 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.797476 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" Apr 22 16:28:13.797528 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.797507 2573 scope.go:117] "RemoveContainer" containerID="74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a" Apr 22 16:28:13.799601 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.799568 2573 status_manager.go:895] "Failed to get status for pod" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" err="pods \"must-gather-qr5cn\" is forbidden: User \"system:node:ip-10-0-143-128.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wbrmb\": no relationship found between node 'ip-10-0-143-128.ec2.internal' and this object" Apr 22 16:28:13.806341 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.806313 2573 scope.go:117] "RemoveContainer" containerID="66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff" Apr 22 16:28:13.809187 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.809157 2573 status_manager.go:895] "Failed to get status for pod" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" pod="openshift-must-gather-wbrmb/must-gather-qr5cn" err="pods \"must-gather-qr5cn\" is forbidden: User \"system:node:ip-10-0-143-128.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wbrmb\": no relationship found between node 'ip-10-0-143-128.ec2.internal' and this object" Apr 22 16:28:13.811489 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.811462 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bw7vt\" (UniqueName: \"kubernetes.io/projected/1e239310-630a-43d3-b840-c82a1f7ef3ff-kube-api-access-bw7vt\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\"" Apr 22 16:28:13.811489 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.811490 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/1e239310-630a-43d3-b840-c82a1f7ef3ff-must-gather-output\") on node \"ip-10-0-143-128.ec2.internal\" DevicePath \"\""
Apr 22 16:28:13.821717 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.821690 2573 scope.go:117] "RemoveContainer" containerID="74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a"
Apr 22 16:28:13.822034 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:28:13.822012 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a\": container with ID starting with 74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a not found: ID does not exist" containerID="74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a"
Apr 22 16:28:13.822100 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.822045 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a"} err="failed to get container status \"74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a\": rpc error: code = NotFound desc = could not find container \"74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a\": container with ID starting with 74a6edbdc30815af5ac4dc45ec85ccd23fe9be0b7b26cd2238c90ccbf0942f9a not found: ID does not exist"
Apr 22 16:28:13.822100 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.822098 2573 scope.go:117] "RemoveContainer" containerID="66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff"
Apr 22 16:28:13.822409 ip-10-0-143-128 kubenswrapper[2573]: E0422 16:28:13.822389 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff\": container with ID starting with 66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff not found: ID does not exist" containerID="66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff"
Apr 22 16:28:13.822459 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:13.822417 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff"} err="failed to get container status \"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff\": rpc error: code = NotFound desc = could not find container \"66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff\": container with ID starting with 66c3aca517e821bce4b825f365a213a765fd59783df48569adf857cbddb57aff not found: ID does not exist"
Apr 22 16:28:15.108468 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.108439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tskkt_500e5ead-fa2e-40ff-8137-d9dbe8098414/kube-state-metrics/0.log"
Apr 22 16:28:15.131969 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.131930 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tskkt_500e5ead-fa2e-40ff-8137-d9dbe8098414/kube-rbac-proxy-main/0.log"
Apr 22 16:28:15.154753 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.154723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tskkt_500e5ead-fa2e-40ff-8137-d9dbe8098414/kube-rbac-proxy-self/0.log"
Apr 22 16:28:15.183298 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.183264 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56b9bf79c7-j65r7_3e3e029f-5d45-4031-9e8d-502a525ad806/metrics-server/0.log"
Apr 22 16:28:15.253921 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.253874 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" path="/var/lib/kubelet/pods/1e239310-630a-43d3-b840-c82a1f7ef3ff/volumes"
Apr 22 16:28:15.404794 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.404709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jmm_a9bbe7fd-fc20-4d50-828c-7ba2ab200da2/node-exporter/0.log"
Apr 22 16:28:15.430646 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.430613 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jmm_a9bbe7fd-fc20-4d50-828c-7ba2ab200da2/kube-rbac-proxy/0.log"
Apr 22 16:28:15.452645 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.452617 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jmm_a9bbe7fd-fc20-4d50-828c-7ba2ab200da2/init-textfile/0.log"
Apr 22 16:28:15.780632 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.780598 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g82q2_a4743dda-39bf-4a2e-b129-091f383cd787/prometheus-operator/0.log"
Apr 22 16:28:15.808704 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:15.808670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g82q2_a4743dda-39bf-4a2e-b129-091f383cd787/kube-rbac-proxy/0.log"
Apr 22 16:28:17.626810 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:17.626773 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/1.log"
Apr 22 16:28:17.631081 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:17.631032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-44bxm_11386c68-c09f-4923-91a0-bfd58155fe9e/console-operator/2.log"
Apr 22 16:28:17.962490 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:17.962378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b4bb6977d-t29g2_5769d82c-40d1-49a8-8038-d6c1ee6a9ece/console/0.log"
Apr 22 16:28:18.290518 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290475 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"]
Apr 22 16:28:18.290827 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290815 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="copy"
Apr 22 16:28:18.290892 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290829 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="copy"
Apr 22 16:28:18.290892 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290842 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="gather"
Apr 22 16:28:18.290892 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290848 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="gather"
Apr 22 16:28:18.290994 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290894 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="copy"
Apr 22 16:28:18.290994 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.290902 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e239310-630a-43d3-b840-c82a1f7ef3ff" containerName="gather"
Apr 22 16:28:18.296571 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.296543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.299436 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.299408 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"kube-root-ca.crt\""
Apr 22 16:28:18.299630 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.299408 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"openshift-service-ca.crt\""
Apr 22 16:28:18.299630 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.299512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrh8b\"/\"default-dockercfg-rrfw9\""
Apr 22 16:28:18.299842 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.299820 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"]
Apr 22 16:28:18.447531 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.447484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-lib-modules\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.447531 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.447529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-podres\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.447810 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.447626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckxv\" (UniqueName: \"kubernetes.io/projected/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-kube-api-access-bckxv\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.447810 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.447712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-proc\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.447810 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.447747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-sys\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.548786 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-sys\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.548786 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-lib-modules\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.548786 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-podres\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.548786 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bckxv\" (UniqueName: \"kubernetes.io/projected/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-kube-api-access-bckxv\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.549130 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-proc\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.549130 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-sys\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.549130 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-lib-modules\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.549130 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-proc\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.549130 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.548942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-podres\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.556887 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.556856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckxv\" (UniqueName: \"kubernetes.io/projected/57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2-kube-api-access-bckxv\") pod \"perf-node-gather-daemonset-xpv9v\" (UID: \"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.607963 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.607922 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:18.735625 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.735457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"]
Apr 22 16:28:18.738318 ip-10-0-143-128 kubenswrapper[2573]: W0422 16:28:18.738273 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod57e3e0ce_a7b2_4add_a62e_fcfeff4c64f2.slice/crio-bea0fecf5940294d797d58e205be30dc6373c52004fdb76762cb552f8d9789a0 WatchSource:0}: Error finding container bea0fecf5940294d797d58e205be30dc6373c52004fdb76762cb552f8d9789a0: Status 404 returned error can't find the container with id bea0fecf5940294d797d58e205be30dc6373c52004fdb76762cb552f8d9789a0
Apr 22 16:28:18.813110 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.813049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v" event={"ID":"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2","Type":"ContainerStarted","Data":"26aa28ba86492b52b9dcfebb0dfa9764b104d448d1a55c4019b928f9e6c89203"}
Apr 22 16:28:18.813110 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:18.813115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v" event={"ID":"57e3e0ce-a7b2-4add-a62e-fcfeff4c64f2","Type":"ContainerStarted","Data":"bea0fecf5940294d797d58e205be30dc6373c52004fdb76762cb552f8d9789a0"}
Apr 22 16:28:19.073962 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.073887 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w5vh_2bcb9971-bb8e-460b-b9e5-409f39381abb/dns/0.log"
Apr 22 16:28:19.096491 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.096463 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w5vh_2bcb9971-bb8e-460b-b9e5-409f39381abb/kube-rbac-proxy/0.log"
Apr 22 16:28:19.207654 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.207624 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-64nrn_f9cf0d97-f5d8-44fe-a781-fa3940c08f48/dns-node-resolver/0.log"
Apr 22 16:28:19.689251 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.689222 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8sqml_2bb9d855-0dbd-4a3b-93cc-7fb30fd48f69/node-ca/0.log"
Apr 22 16:28:19.816404 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.816375 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:19.831547 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:19.831488 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v" podStartSLOduration=1.831472508 podStartE2EDuration="1.831472508s" podCreationTimestamp="2026-04-22 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:28:19.829778288 +0000 UTC m=+3271.155531941" watchObservedRunningTime="2026-04-22 16:28:19.831472508 +0000 UTC m=+3271.157226159"
Apr 22 16:28:20.758440 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:20.758402 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-x6h97_4078b99e-a844-47ea-8e8d-88fefc3efd1b/serve-healthcheck-canary/0.log"
Apr 22 16:28:21.164476 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:21.164389 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fx4lq_b4c0ab11-708d-4eeb-bc03-2f8ab994f98e/kube-rbac-proxy/0.log"
Apr 22 16:28:21.185302 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:21.185275 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fx4lq_b4c0ab11-708d-4eeb-bc03-2f8ab994f98e/exporter/0.log"
Apr 22 16:28:21.205839 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:21.205811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fx4lq_b4c0ab11-708d-4eeb-bc03-2f8ab994f98e/extractor/0.log"
Apr 22 16:28:25.829562 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:25.829534 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-xpv9v"
Apr 22 16:28:27.399288 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.399207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:28:27.420162 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.420125 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/egress-router-binary-copy/0.log"
Apr 22 16:28:27.443135 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.443102 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/cni-plugins/0.log"
Apr 22 16:28:27.463562 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.463537 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/bond-cni-plugin/0.log"
Apr 22 16:28:27.483970 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.483939 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/routeoverride-cni/0.log"
Apr 22 16:28:27.506794 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.506762 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/whereabouts-cni-bincopy/0.log"
Apr 22 16:28:27.527038 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.527008 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zr4rf_37d4dbd3-61f0-47a0-bd23-69d3cd755850/whereabouts-cni/0.log"
Apr 22 16:28:27.555954 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.555920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g55gp_a74db667-642b-4eca-91b6-af4048b9410f/kube-multus/0.log"
Apr 22 16:28:27.688824 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.688793 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vk5nl_0708298d-9f47-4968-9489-c7cb22cb282c/network-metrics-daemon/0.log"
Apr 22 16:28:27.708363 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:27.708335 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vk5nl_0708298d-9f47-4968-9489-c7cb22cb282c/kube-rbac-proxy/0.log"
Apr 22 16:28:29.090204 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.090175 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-controller/0.log"
Apr 22 16:28:29.107579 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.107539 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/0.log"
Apr 22 16:28:29.125794 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.125764 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovn-acl-logging/1.log"
Apr 22 16:28:29.144583 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.144553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/kube-rbac-proxy-node/0.log"
Apr 22 16:28:29.163757 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.163723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:28:29.180927 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.180896 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/northd/0.log"
Apr 22 16:28:29.203152 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.203112 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/nbdb/0.log"
Apr 22 16:28:29.223151 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.223127 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/sbdb/0.log"
Apr 22 16:28:29.333694 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:29.333609 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwt7w_e237e451-58c6-4255-bef9-a4ac5f2d06c7/ovnkube-controller/0.log"
Apr 22 16:28:30.397138 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:30.397106 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-kgskm_ee532035-be0c-4e7e-a7b9-da60be33b91c/check-endpoints/0.log"
Apr 22 16:28:30.444200 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:30.444169 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jgsl7_07586edf-24f7-4873-81ac-df167bc41e5e/network-check-target-container/0.log"
Apr 22 16:28:31.292386 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:31.292359 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-49km7_3377967e-b456-4b8d-922f-ecf8e91bf364/iptables-alerter/0.log"
Apr 22 16:28:31.908199 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:31.908172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-kvwxs_88c02699-dde6-4f8d-bf08-671bfdb840da/tuned/0.log"
Apr 22 16:28:33.530393 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:33.530361 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qxgng_33db7d39-27dc-47a6-83dc-91f5dff0fb7c/cluster-samples-operator/0.log"
Apr 22 16:28:33.561652 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:33.561610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qxgng_33db7d39-27dc-47a6-83dc-91f5dff0fb7c/cluster-samples-operator-watch/0.log"
Apr 22 16:28:34.745867 ip-10-0-143-128 kubenswrapper[2573]: I0422 16:28:34.745836 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-hd9lf_d688f2b7-158c-4398-9277-b2535423a024/service-ca-controller/0.log"