Apr 23 16:34:52.594903 ip-10-0-130-110 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:34:53.117094 ip-10-0-130-110 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:53.117094 ip-10-0-130-110 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:34:53.117094 ip-10-0-130-110 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:53.117094 ip-10-0-130-110 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:34:53.117094 ip-10-0-130-110 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:53.118236 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.117773 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125675 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125693 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125697 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125701 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125704 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:53.125698 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125707 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125710 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125712 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125715 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125718 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125720 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125724 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125728 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125732 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125735 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125738 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125741 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125744 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125746 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125749 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125751 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125754 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125756 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125759 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:53.125928 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125761 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125764 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125766 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125769 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125772 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125781 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125785 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125787 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125790 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125793 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125795 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125798 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125801 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125804 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125806 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125809 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125812 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125815 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125820 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:53.126428 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125823 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125826 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125829 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125832 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125834 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125837 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125840 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125843 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125846 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125849 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125852 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125854 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125857 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125860 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125863 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125866 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125868 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125871 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125874 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125876 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:53.126922 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125879 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125881 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125884 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125886 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125888 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125891 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125894 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125897 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125900 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125902 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125905 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125907 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125910 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125913 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125915 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125919 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125921 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125923 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125926 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125929 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:53.127415 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125931 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125934 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.125936 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126329 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126334 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126336 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126339 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126341 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126344 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126347 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126349 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126352 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126354 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126357 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126359 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126362 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126364 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126366 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126369 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:53.127920 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126372 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126375 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126378 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126380 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126383 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126385 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126388 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126390 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126394 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126396 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126399 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126402 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126404 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126407 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126409 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126411 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126414 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126416 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126419 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126422 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:53.128371 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126424 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126427 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126431 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126435 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126438 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126441 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126444 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126447 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126450 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126454 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126457 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126460 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126462 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126466 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126469 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126471 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126474 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126476 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126479 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:53.128924 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126483 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126485 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126488 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126491 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126493 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126495 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126498 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126500 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126503 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126505 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126508 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126510 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126513 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126515 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126518 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126520 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126522 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126542 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126545 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126548 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:53.129469 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126551 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126555 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126558 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126560 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126563 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126565 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126568 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126571 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126573 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126576 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.126579 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128022 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128031 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128038 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128043 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128049 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128052 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128057 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128062 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128065 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128068 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:34:53.129973 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128071 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128074 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128078 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128081 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128083 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128086 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128089 2575 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128091 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128095 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128100 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128102 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128106 2575 flags.go:64] FLAG: --config-dir=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128109 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128112 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128116 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128119 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128123 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128126 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128129 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128132 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128135 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128138 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128141 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128145 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128148 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:34:53.130479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128151 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128153 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128157 2575 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128160 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128164 2575 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128167 2575 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128170 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128173 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128175 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128179 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128182 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128185 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128188 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128191 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128193 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128196 2575 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128199 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128202 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128205 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128208 2575 flags.go:64] FLAG: --feature-gates="" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128212 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128215 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128218 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128221 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128224 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:34:53.131117 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128227 2575 flags.go:64] FLAG: --help="false" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128230 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128233 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128236 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128239 2575 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128242 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128246 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128249 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128252 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128254 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128258 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128261 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128264 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128266 2575 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128270 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128273 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128276 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128278 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:34:53.131724 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:34:53.128281 2575 flags.go:64] FLAG: --lock-file="" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128284 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128286 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128290 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128295 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:34:53.131724 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128298 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128301 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128304 2575 flags.go:64] FLAG: --logging-format="text" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128307 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128311 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128314 2575 flags.go:64] FLAG: --manifest-url="" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128316 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128321 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128324 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128328 2575 flags.go:64] FLAG: --max-pods="110" Apr 23 
16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128331 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128334 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128336 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128339 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128342 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128345 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128348 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128356 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128359 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128362 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128365 2575 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128368 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128374 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 
16:34:53.128378 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:34:53.132267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128381 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128384 2575 flags.go:64] FLAG: --port="10250" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128387 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128390 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00901d5de3512d2de" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128393 2575 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128396 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128399 2575 flags.go:64] FLAG: --register-node="true" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128401 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128404 2575 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128408 2575 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128411 2575 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128414 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128416 2575 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128420 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128423 2575 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128426 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128428 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128431 2575 flags.go:64] FLAG: --runonce="false" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128434 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128436 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128439 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128442 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128445 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128447 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128450 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128453 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:34:53.132850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128456 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128459 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128462 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 
16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128465 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128468 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128471 2575 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128474 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128479 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128482 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128484 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128489 2575 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128491 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128494 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128496 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128499 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128502 2575 flags.go:64] FLAG: --v="2" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128506 2575 flags.go:64] FLAG: --version="false" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128510 2575 flags.go:64] FLAG: --vmodule="" 
Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128514 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.128517 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128623 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128628 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128631 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128633 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:53.133458 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128636 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128639 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128642 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128644 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128646 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128649 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128651 
2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128654 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128656 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128659 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128661 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128663 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128666 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128669 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128672 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128675 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128677 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128680 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128682 2575 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 23 16:34:53.134065 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128684 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128687 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128689 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128692 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128694 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128697 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128699 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128701 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128704 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128706 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128709 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128711 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128713 2575 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128716 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128718 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128721 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128723 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128725 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128728 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128730 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:53.134573 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128733 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128735 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128738 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128740 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128742 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 
16:34:53.128745 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128750 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128752 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128755 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128757 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.128760 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129463 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129469 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129472 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129475 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129478 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129482 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129485 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129487 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:53.135080 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129490 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129493 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129495 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129498 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129500 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129503 2575 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129505 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129507 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129510 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129512 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129515 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129517 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129519 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129522 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129539 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129542 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129544 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129546 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:53.135562 ip-10-0-130-110 
kubenswrapper[2575]: W0423 16:34:53.129550 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129558 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:53.135562 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129565 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:53.136064 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129568 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:53.136064 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129570 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:53.136064 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.129573 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:53.136064 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.129581 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:53.138007 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.137982 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 16:34:53.138007 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.138005 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138089 2575 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138097 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138102 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138107 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138112 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138116 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138121 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138125 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138130 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138134 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138138 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138142 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138147 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:53.138156 
ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138151 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138155 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138159 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:53.138156 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138163 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138168 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138173 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138177 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138182 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138186 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138191 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138195 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138199 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138203 2575 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138207 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138211 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138215 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138221 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138229 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138235 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138240 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138245 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138249 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:53.138937 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138253 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138258 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138262 2575 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138265 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138269 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138273 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138277 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138281 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138285 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138289 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138294 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138298 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138301 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138306 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138310 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138315 2575 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138319 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138323 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138327 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138331 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138336 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:53.139583 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138340 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138344 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138348 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138352 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138356 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138359 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138363 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138367 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138372 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138376 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138380 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138384 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138391 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138397 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138402 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138406 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138410 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138414 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138418 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:53.140227 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138422 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:53.140227 
ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138427 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138431 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138434 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138438 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138443 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138448 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138453 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138457 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138461 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138465 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.138474 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138669 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138679 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138684 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138688 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:53.140800 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138692 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138696 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138700 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138705 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138709 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138713 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138720 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138726 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138731 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138735 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138739 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138743 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138747 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138751 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138755 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138758 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138762 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138766 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138770 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138775 2575 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 23 16:34:53.141375 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138779 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138814 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138822 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138827 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138842 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138846 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138851 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138856 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138861 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138865 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138870 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138874 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:53.141998 ip-10-0-130-110 
kubenswrapper[2575]: W0423 16:34:53.138879 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138883 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138887 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138891 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138895 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138899 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138906 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:34:53.141998 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138912 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138916 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138921 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138925 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138930 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138934 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138938 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138942 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138946 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138950 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138954 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138958 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138962 2575 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138967 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138971 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138976 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138980 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138984 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138989 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138993 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:53.142812 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.138997 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139001 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139005 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139009 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139013 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:53.143353 ip-10-0-130-110 
kubenswrapper[2575]: W0423 16:34:53.139017 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139022 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139026 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139030 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139034 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139038 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139042 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139046 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139050 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139055 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139059 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139063 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139067 2575 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139071 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:53.143353 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139075 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139079 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139083 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:53.139087 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.139095 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.139939 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 16:34:53.144026 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.143643 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 16:34:53.144898 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.144885 2575 server.go:1019] "Starting client certificate rotation" Apr 23 16:34:53.145016 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.144998 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:34:53.145048 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.145042 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:34:53.174728 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.174701 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:34:53.183580 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.183558 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:34:53.210191 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.210169 2575 log.go:25] "Validated CRI v1 runtime API" Apr 23 16:34:53.218904 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.218888 2575 log.go:25] "Validated CRI v1 image API" Apr 23 16:34:53.220670 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.220644 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 16:34:53.220751 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.220678 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:34:53.230170 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.230143 2575 fs.go:135] Filesystem UUIDs: map[25697a84-1ec8-4ef8-8651-eba7136e0751:/dev/nvme0n1p4 49cac335-07e9-4e12-b69c-999d1fbf29a1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 23 16:34:53.230258 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.230169 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 
fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 16:34:53.236389 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.236279 2575 manager.go:217] Machine: {Timestamp:2026-04-23 16:34:53.233973304 +0000 UTC m=+0.496847774 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100395 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cefa1582aac377465bb354621d623 SystemUUID:ec2cefa1-582a-ac37-7465-bb354621d623 BootID:f359bd42-e572-45aa-a203-5eff634d6151 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f5:91:c2:be:f5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f5:91:c2:be:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:43:08:13:13:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 16:34:53.236389 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.236376 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 16:34:53.236564 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.236549 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 16:34:53.237828 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.237800 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 16:34:53.238006 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.237831 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-110.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:34:53.238091 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.238020 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:34:53.238091 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.238032 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:34:53.238091 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.238056 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:53.239208 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.239185 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:53.241039 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.241026 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:34:53.241169 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.241158 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:34:53.243876 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.243864 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:34:53.243944 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.243882 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:34:53.243944 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.243897 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:34:53.243944 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.243910 2575 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:34:53.243944 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.243923 2575 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 23 16:34:53.244991 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.244978 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:53.245061 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.245000 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:53.247967 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.247939 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:34:53.249956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.249943 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:34:53.251913 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251902 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:34:53.251951 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251918 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:34:53.251951 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251924 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:34:53.251951 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251929 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:34:53.251951 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251937 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:34:53.251951 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251947 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251955 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251962 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251970 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251976 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251985 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:34:53.252087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.251993 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:34:53.252650 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.252640 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:34:53.252682 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.252653 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:34:53.256501 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.256480 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:34:53.256594 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.256537 2575 server.go:1295] "Started kubelet" Apr 23 16:34:53.256648 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.256593 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:34:53.256772 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.256717 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:34:53.256810 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.256800 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:34:53.257307 ip-10-0-130-110 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 16:34:53.259546 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.259505 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:34:53.259865 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.259840 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:34:53.259865 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.259837 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-110.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:34:53.260142 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.260119 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:34:53.261918 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.261901 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:34:53.267204 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.265907 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-110.ec2.internal.18a9099bebe34cfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-110.ec2.internal,UID:ip-10-0-130-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-110.ec2.internal,},FirstTimestamp:2026-04-23 16:34:53.256494334 +0000 UTC m=+0.519368805,LastTimestamp:2026-04-23 16:34:53.256494334 +0000 UTC m=+0.519368805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-110.ec2.internal,}" Apr 23 16:34:53.267283 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.267245 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p8h2m" Apr 23 16:34:53.270319 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.270304 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:34:53.270670 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.270647 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 16:34:53.271168 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.271151 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:34:53.272119 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272101 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 16:34:53.272119 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272117 2575 factory.go:55] Registering systemd factory Apr 23 16:34:53.272119 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272123 2575 factory.go:223] Registration of the systemd container factory successfully Apr 23 16:34:53.272368 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.272348 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found" Apr 23 16:34:53.272368 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272370 2575 factory.go:153] Registering CRI-O factory Apr 23 16:34:53.272368 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272380 2575 factory.go:223] Registration of the crio container factory successfully Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272393 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272396 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272428 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272401 2575 factory.go:103] Registering Raw factory Apr 23 
16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272463 2575 manager.go:1196] Started watching for new ooms in manager Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272511 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 23 16:34:53.272578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272522 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 23 16:34:53.272956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.272940 2575 manager.go:319] Starting recovery of all containers Apr 23 16:34:53.273544 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.273497 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p8h2m" Apr 23 16:34:53.280511 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.280489 2575 manager.go:324] Recovery completed Apr 23 16:34:53.284727 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.284705 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.286084 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.286068 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:53.287407 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287392 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.287475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287419 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.287475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287433 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.287938 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287925 
2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 16:34:53.288001 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287940 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 16:34:53.288001 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.287961 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:34:53.292037 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.292024 2575 policy_none.go:49] "None policy: Start" Apr 23 16:34:53.292037 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.292029 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-110.ec2.internal\" not found" node="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.292144 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.292040 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 16:34:53.292144 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.292066 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 23 16:34:53.344221 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344201 2575 manager.go:341] "Starting Device Plugin manager" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.344236 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344245 2575 server.go:85] "Starting device plugin registration server" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344499 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344510 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344634 2575 plugin_watcher.go:51] "Plugin Watcher 
Start" path="/var/lib/kubelet/plugins_registry" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344720 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.344729 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.345439 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 16:34:53.356131 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.345480 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-110.ec2.internal\" not found" Apr 23 16:34:53.357918 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.357896 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 16:34:53.359237 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.359212 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 16:34:53.359341 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.359244 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 16:34:53.359341 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.359261 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 16:34:53.359341 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.359268 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 16:34:53.359341 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.359298 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 16:34:53.361681 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.361662 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:53.445701 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.445628 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.446742 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.446723 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.446845 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.446755 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.446845 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.446767 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.446845 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.446796 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.456815 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.456797 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.456861 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.456819 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-110.ec2.internal\": node \"ip-10-0-130-110.ec2.internal\" not found" Apr 23 
16:34:53.459769 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.459754 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal"] Apr 23 16:34:53.459817 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.459812 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.462398 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.462381 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.462495 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.462412 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.462495 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.462423 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.463661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.463649 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.463818 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.463804 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.463854 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.463834 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.464926 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.464906 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.465009 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.464939 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.465009 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.464948 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.465009 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.464913 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.465120 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.465026 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.465120 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.465043 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.467130 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.467112 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.467199 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.467147 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:53.468025 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.468009 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:53.468094 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.468034 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:53.468094 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.468045 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:53.474196 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.474180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.474267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.474203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.474267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.474218 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/edb81deebb56b756d8013eea0bbef625-config\") pod \"kube-apiserver-proxy-ip-10-0-130-110.ec2.internal\" (UID: \"edb81deebb56b756d8013eea0bbef625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.476090 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.476071 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found" Apr 23 16:34:53.491169 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.491144 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-110.ec2.internal\" not found" node="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.495674 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.495658 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-110.ec2.internal\" not found" node="ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.574415 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" Apr 23 16:34:53.574517 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/edb81deebb56b756d8013eea0bbef625-config\") pod \"kube-apiserver-proxy-ip-10-0-130-110.ec2.internal\" (UID: \"edb81deebb56b756d8013eea0bbef625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" Apr 
Apr 23 16:34:53.574517 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.574517 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.574517 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/edb81deebb56b756d8013eea0bbef625-config\") pod \"kube-apiserver-proxy-ip-10-0-130-110.ec2.internal\" (UID: \"edb81deebb56b756d8013eea0bbef625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.574670 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.574502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b7669b8a48e9291e3d21d66b2ad87c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal\" (UID: \"25b7669b8a48e9291e3d21d66b2ad87c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.576495 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.576478 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:53.677604 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.677558 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:53.778250 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.778229 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:53.793413 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.793392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.799356 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:53.799337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:53.878786 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.878754 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:53.979353 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:53.979328 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.079920 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.079856 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.145086 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.145052 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:34:54.145710 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.145590 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:34:54.145710 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.145589 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:34:54.180771 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.180738 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.200574 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.200548 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:54.270602 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.270576 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:34:54.279072 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.279025 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:29:53 +0000 UTC" deadline="2027-10-28 08:07:36.53616744 +0000 UTC"
Apr 23 16:34:54.279072 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.279066 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13263h32m42.257104757s"
Apr 23 16:34:54.281426 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.281214 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.283936 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.283914 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:54.284023 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:54.283938 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb81deebb56b756d8013eea0bbef625.slice/crio-00006bd34679f5d1c487da410079da63f5b2ec374f6e920ca114daf4c766e032 WatchSource:0}: Error finding container 00006bd34679f5d1c487da410079da63f5b2ec374f6e920ca114daf4c766e032: Status 404 returned error can't find the container with id 00006bd34679f5d1c487da410079da63f5b2ec374f6e920ca114daf4c766e032
Apr 23 16:34:54.284701 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:54.284673 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b7669b8a48e9291e3d21d66b2ad87c.slice/crio-acd5ae81b23394aac9c66fe9d63f7cb474bbb0f21a56da6cfd55ae10f1c35aa9 WatchSource:0}: Error finding container acd5ae81b23394aac9c66fe9d63f7cb474bbb0f21a56da6cfd55ae10f1c35aa9: Status 404 returned error can't find the container with id acd5ae81b23394aac9c66fe9d63f7cb474bbb0f21a56da6cfd55ae10f1c35aa9
Apr 23 16:34:54.289278 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.289265 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:34:54.309910 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.309890 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pp2hr"
Apr 23 16:34:54.321689 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.321668 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pp2hr"
Apr 23 16:34:54.362123 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.362024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" event={"ID":"edb81deebb56b756d8013eea0bbef625","Type":"ContainerStarted","Data":"00006bd34679f5d1c487da410079da63f5b2ec374f6e920ca114daf4c766e032"}
Apr 23 16:34:54.362939 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.362916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" event={"ID":"25b7669b8a48e9291e3d21d66b2ad87c","Type":"ContainerStarted","Data":"acd5ae81b23394aac9c66fe9d63f7cb474bbb0f21a56da6cfd55ae10f1c35aa9"}
Apr 23 16:34:54.382445 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.382420 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.483028 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.482997 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.583606 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.583571 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.684371 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.684295 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.785049 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:54.785012 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-110.ec2.internal\" not found"
Apr 23 16:34:54.852704 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.852675 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:54.872424 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.872395 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:54.888280 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.888201 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:34:54.889277 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.889105 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal"
Apr 23 16:34:54.900574 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:54.900555 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:34:55.215352 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.215324 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:55.244616 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.244585 2575 apiserver.go:52] "Watching apiserver"
Apr 23 16:34:55.252982 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.252959 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 16:34:55.254172 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.254143 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-qlwjk","openshift-network-operator/iptables-alerter-gfsbv","openshift-ovn-kubernetes/ovnkube-node-75zl8","kube-system/konnectivity-agent-7fhkj","openshift-cluster-node-tuning-operator/tuned-8ltxj","openshift-image-registry/node-ca-fx8gg","openshift-multus/multus-additional-cni-plugins-kznw2","kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal","openshift-multus/multus-x4nj8","openshift-multus/network-metrics-daemon-grr5m"]
Apr 23 16:34:55.256782 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.256760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:34:55.256892 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.256835 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:34:55.256952 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.256940 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gfsbv"
Apr 23 16:34:55.258609 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.258584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.261072 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.261051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7fhkj"
Apr 23 16:34:55.261182 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.261156 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.261960 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.261939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7btr\""
Apr 23 16:34:55.262075 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.262010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.262672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.262608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fx8gg"
Apr 23 16:34:55.262768 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.262714 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 16:34:55.263110 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263094 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 16:34:55.263234 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263211 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 16:34:55.263293 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263265 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.263386 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:34:55.263465 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263450 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.263543 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-n6tl9\""
Apr 23 16:34:55.263685 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.263685 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263666 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 16:34:55.263935 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.263917 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.264080 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.264059 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 16:34:55.264312 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.264294 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 16:34:55.264567 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.264547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qcf7r\""
Apr 23 16:34:55.264826 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.264811 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.264921 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.264862 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qhg9q\""
Apr 23 16:34:55.265275 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.265256 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.265498 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.265403 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.266543 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.266508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.266633 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.266621 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.266873 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.266860 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.268065 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.267635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 16:34:55.268065 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.267676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wl7qd\""
Apr 23 16:34:55.268065 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.267717 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.268065 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.267930 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 16:34:55.268293 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.268138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:34:55.268293 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.268213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:34:55.268630 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.268613 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xvnpc\""
Apr 23 16:34:55.268823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.268805 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ccff8\""
Apr 23 16:34:55.268984 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.268972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 16:34:55.269036 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.268973 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.271906 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.271886 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 16:34:55.272008 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.271892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:34:55.273369 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.273352 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 16:34:55.274029 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.273929 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 16:34:55.274029 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.273952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 16:34:55.274029 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.273967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 16:34:55.274329 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.274311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vq8nn\""
Apr 23 16:34:55.284384 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-system-cni-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.284459 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.284459 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-multus-certs\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.284459 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.284459 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-modprobe-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-tmp\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-node-log\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-netd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovn-node-metrics-cert\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2333206a-7852-464c-abb6-f9aab741441c-iptables-alerter-script\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-bin\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284639 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-device-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-systemd\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-k8s-cni-cncf-io\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-netns\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-hostroot\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-run\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-sys\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-slash\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-registration-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kt2\" (UniqueName: \"kubernetes.io/projected/2333206a-7852-464c-abb6-f9aab741441c-kube-api-access-56kt2\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysconfig\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-lib-modules\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svrkn\" (UniqueName: \"kubernetes.io/projected/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-kube-api-access-svrkn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.284929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-etc-tuned\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cnibin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-socket-dir-parent\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-kubelet\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-cnibin\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.284988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-os-release\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9v6r\" (UniqueName: \"kubernetes.io/projected/0a621457-5864-41d6-92e2-1094d54e41ae-kube-api-access-z9v6r\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-socket-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-var-lib-kubelet\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-os-release\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-script-lib\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdbd610-b42f-4ea5-9463-a02f295038a1-konnectivity-ca\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-ovn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-log-socket\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrkg\" (UniqueName: \"kubernetes.io/projected/3e7ecdff-727d-4e4b-8277-cecdb204253c-kube-api-access-mcrkg\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-host\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.285434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285198 2575
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-etc-kubernetes\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-netns\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-var-lib-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " 
pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzh9\" (UniqueName: \"kubernetes.io/projected/db812877-6017-42fa-aef3-97e4f03153ce-kube-api-access-bkzh9\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cni-binary-copy\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-multus\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-kubelet\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-conf-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-sys-fs\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-system-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcx9j\" (UniqueName: \"kubernetes.io/projected/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-kube-api-access-fcx9j\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285661 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-env-overrides\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbt2r\" (UniqueName: \"kubernetes.io/projected/f44bf524-5ea5-4d89-8f40-5a45087669b8-kube-api-access-zbt2r\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.286108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxcn\" (UniqueName: \"kubernetes.io/projected/53be1a16-14c5-4f4a-b293-a31505dc39e3-kube-api-access-5jxcn\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-conf\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2333206a-7852-464c-abb6-f9aab741441c-host-slash\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-systemd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44bf524-5ea5-4d89-8f40-5a45087669b8-host\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-kubernetes\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-config\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cdbd610-b42f-4ea5-9463-a02f295038a1-agent-certs\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.285992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-bin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.286018 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-daemon-config\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.286052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-systemd-units\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.286088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-etc-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.286119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.286823 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.286145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f44bf524-5ea5-4d89-8f40-5a45087669b8-serviceca\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 
16:34:55.288785 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.288767 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:55.322709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.322679 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:54 +0000 UTC" deadline="2027-10-13 09:04:58.228745144 +0000 UTC" Apr 23 16:34:55.322709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.322709 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12904h30m2.906039833s" Apr 23 16:34:55.386894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.386845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-daemon-config\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.386894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.386890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-systemd-units\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.386920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-etc-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.386944 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.386969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f44bf524-5ea5-4d89-8f40-5a45087669b8-serviceca\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-system-cni-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-multus-certs\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.387101 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-modprobe-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-tmp\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-node-log\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387101 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-netd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovn-node-metrics-cert\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2333206a-7852-464c-abb6-f9aab741441c-iptables-alerter-script\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-bin\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-device-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-systemd\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-k8s-cni-cncf-io\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-netns\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-hostroot\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-run\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-modprobe-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-sys\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-slash\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-registration-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-systemd-units\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.387661 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-etc-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56kt2\" (UniqueName: \"kubernetes.io/projected/2333206a-7852-464c-abb6-f9aab741441c-kube-api-access-56kt2\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387445 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysconfig\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-lib-modules\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svrkn\" (UniqueName: \"kubernetes.io/projected/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-kube-api-access-svrkn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-etc-tuned\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.388419 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:34:55.387574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cnibin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-socket-dir-parent\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-kubelet\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-cnibin\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-os-release\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.388419 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9v6r\" (UniqueName: \"kubernetes.io/projected/0a621457-5864-41d6-92e2-1094d54e41ae-kube-api-access-z9v6r\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-socket-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-var-lib-kubelet\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-os-release\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.388419 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-script-lib\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-k8s-cni-cncf-io\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-netns\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-device-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-systemd\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387900 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-hostroot\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-sys\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdbd610-b42f-4ea5-9463-a02f295038a1-konnectivity-ca\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-slash\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.387998 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-ovn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-registration-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-ovn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-daemon-config\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388125 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-netd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-log-socket\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.389261 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-node-log\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrkg\" (UniqueName: \"kubernetes.io/projected/3e7ecdff-727d-4e4b-8277-cecdb204253c-kube-api-access-mcrkg\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-host\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388248 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-etc-kubernetes\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-netns\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-var-lib-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:34:55.388345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2333206a-7852-464c-abb6-f9aab741441c-iptables-alerter-script\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388374 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-script-lib\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzh9\" (UniqueName: \"kubernetes.io/projected/db812877-6017-42fa-aef3-97e4f03153ce-kube-api-access-bkzh9\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-cni-bin\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-run\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cni-binary-copy\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-multus\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390108 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-kubelet\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " 
pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-conf-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-sys-fs\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cdbd610-b42f-4ea5-9463-a02f295038a1-konnectivity-ca\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-log-socket\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-system-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-system-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcx9j\" (UniqueName: \"kubernetes.io/projected/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-kube-api-access-fcx9j\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-env-overrides\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-multus\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" 
Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbt2r\" (UniqueName: \"kubernetes.io/projected/f44bf524-5ea5-4d89-8f40-5a45087669b8-kube-api-access-zbt2r\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-kubelet\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxcn\" (UniqueName: \"kubernetes.io/projected/53be1a16-14c5-4f4a-b293-a31505dc39e3-kube-api-access-5jxcn\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-conf-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" 
Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-conf\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2333206a-7852-464c-abb6-f9aab741441c-host-slash\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.390975 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cni-binary-copy\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-host\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-etc-kubernetes\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: 
I0423 16:34:55.388921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-sys-fs\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-systemd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-run-multus-certs\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-system-cni-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.391880 ip-10-0-130-110 
kubenswrapper[2575]: E0423 16:34:55.389056 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.389136 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.889104799 +0000 UTC m=+3.151979257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-os-release\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-cnibin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-run-netns\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-socket-dir-parent\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-var-lib-openvswitch\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-cnibin\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.391880 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-kubelet\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-env-overrides\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-lib-modules\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e7ecdff-727d-4e4b-8277-cecdb204253c-socket-dir\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.388897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-run-systemd\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44bf524-5ea5-4d89-8f40-5a45087669b8-host\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-kubernetes\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8"
Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\"
(UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-conf\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2333206a-7852-464c-abb6-f9aab741441c-host-slash\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-config\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cdbd610-b42f-4ea5-9463-a02f295038a1-agent-certs\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysconfig\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-bin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-sysctl-d\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-var-lib-kubelet\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.392708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-host-var-lib-cni-bin\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-multus-cni-dir\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.389988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/f44bf524-5ea5-4d89-8f40-5a45087669b8-serviceca\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a621457-5864-41d6-92e2-1094d54e41ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44bf524-5ea5-4d89-8f40-5a45087669b8-host\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db812877-6017-42fa-aef3-97e4f03153ce-etc-kubernetes\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-os-release\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovnkube-config\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.390857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a621457-5864-41d6-92e2-1094d54e41ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.391747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-tmp\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.391869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-ovn-node-metrics-cert\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.391903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/db812877-6017-42fa-aef3-97e4f03153ce-etc-tuned\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.393566 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.392983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/2cdbd610-b42f-4ea5-9463-a02f295038a1-agent-certs\") pod \"konnectivity-agent-7fhkj\" (UID: \"2cdbd610-b42f-4ea5-9463-a02f295038a1\") " pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.398783 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.398761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcx9j\" (UniqueName: \"kubernetes.io/projected/6e7cb2c4-5b50-4081-b3e2-d41f862f81f2-kube-api-access-fcx9j\") pod \"multus-x4nj8\" (UID: \"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2\") " pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.398889 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.398786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzh9\" (UniqueName: \"kubernetes.io/projected/db812877-6017-42fa-aef3-97e4f03153ce-kube-api-access-bkzh9\") pod \"tuned-8ltxj\" (UID: \"db812877-6017-42fa-aef3-97e4f03153ce\") " pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.403157 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.402917 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:55.403157 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.402940 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:55.403157 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.402953 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:55.403157 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.403015 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.902997298 +0000 UTC m=+3.165871755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:55.403691 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.403582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbt2r\" (UniqueName: \"kubernetes.io/projected/f44bf524-5ea5-4d89-8f40-5a45087669b8-kube-api-access-zbt2r\") pod \"node-ca-fx8gg\" (UID: \"f44bf524-5ea5-4d89-8f40-5a45087669b8\") " pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.403776 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.403755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kt2\" (UniqueName: \"kubernetes.io/projected/2333206a-7852-464c-abb6-f9aab741441c-kube-api-access-56kt2\") pod \"iptables-alerter-gfsbv\" (UID: \"2333206a-7852-464c-abb6-f9aab741441c\") " pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.405203 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.405174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svrkn\" (UniqueName: \"kubernetes.io/projected/9a94ae8c-bc2c-4382-b11c-ef8e0a012d18-kube-api-access-svrkn\") pod \"ovnkube-node-75zl8\" (UID: \"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18\") " pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.405479 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:34:55.405462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9v6r\" (UniqueName: \"kubernetes.io/projected/0a621457-5864-41d6-92e2-1094d54e41ae-kube-api-access-z9v6r\") pod \"multus-additional-cni-plugins-kznw2\" (UID: \"0a621457-5864-41d6-92e2-1094d54e41ae\") " pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.405934 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.405911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxcn\" (UniqueName: \"kubernetes.io/projected/53be1a16-14c5-4f4a-b293-a31505dc39e3-kube-api-access-5jxcn\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.407463 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.407445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrkg\" (UniqueName: \"kubernetes.io/projected/3e7ecdff-727d-4e4b-8277-cecdb204253c-kube-api-access-mcrkg\") pod \"aws-ebs-csi-driver-node-wqm4p\" (UID: \"3e7ecdff-727d-4e4b-8277-cecdb204253c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.571410 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.571362 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gfsbv" Apr 23 16:34:55.578754 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.578730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:34:55.587417 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.587388 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:34:55.592754 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.592734 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" Apr 23 16:34:55.598272 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.598248 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fx8gg" Apr 23 16:34:55.607429 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.607411 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kznw2" Apr 23 16:34:55.612983 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.612960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" Apr 23 16:34:55.618506 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.618487 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x4nj8" Apr 23 16:34:55.892547 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.892453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:55.892674 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.892596 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:55.892674 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.892651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:56.892637515 +0000 UTC m=+4.155511972 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:55.922012 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:55.921980 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44bf524_5ea5_4d89_8f40_5a45087669b8.slice/crio-2a93d971875076e641adf4cc68d484b42bd6e7ac065a0566b492addeb8477f5e WatchSource:0}: Error finding container 2a93d971875076e641adf4cc68d484b42bd6e7ac065a0566b492addeb8477f5e: Status 404 returned error can't find the container with id 2a93d971875076e641adf4cc68d484b42bd6e7ac065a0566b492addeb8477f5e Apr 23 16:34:55.923401 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:55.923347 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb812877_6017_42fa_aef3_97e4f03153ce.slice/crio-4c9fbd5c4a2311602ca32b1e473678a413cdd4d5bce743050a30980b2cadae00 WatchSource:0}: Error finding container 4c9fbd5c4a2311602ca32b1e473678a413cdd4d5bce743050a30980b2cadae00: Status 404 returned error can't find the container with id 4c9fbd5c4a2311602ca32b1e473678a413cdd4d5bce743050a30980b2cadae00 Apr 23 16:34:55.924306 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:55.924281 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a94ae8c_bc2c_4382_b11c_ef8e0a012d18.slice/crio-c742f2c035edb278ca87d242064717443de26ba91017621c0ca80a1a18ab5868 WatchSource:0}: Error finding container c742f2c035edb278ca87d242064717443de26ba91017621c0ca80a1a18ab5868: Status 404 returned error can't find the container with id c742f2c035edb278ca87d242064717443de26ba91017621c0ca80a1a18ab5868 Apr 23 16:34:55.926557 
ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:55.926478 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7cb2c4_5b50_4081_b3e2_d41f862f81f2.slice/crio-b60f4dc540d7ce1f7e660bdee352bdb40091fd28ce80a07020f123663ddb7c06 WatchSource:0}: Error finding container b60f4dc540d7ce1f7e660bdee352bdb40091fd28ce80a07020f123663ddb7c06: Status 404 returned error can't find the container with id b60f4dc540d7ce1f7e660bdee352bdb40091fd28ce80a07020f123663ddb7c06 Apr 23 16:34:55.928758 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:34:55.928736 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e7ecdff_727d_4e4b_8277_cecdb204253c.slice/crio-a8a6d4e754ad7a57795eae23f6cf366413143baa4d1d8748e37671400d7c8201 WatchSource:0}: Error finding container a8a6d4e754ad7a57795eae23f6cf366413143baa4d1d8748e37671400d7c8201: Status 404 returned error can't find the container with id a8a6d4e754ad7a57795eae23f6cf366413143baa4d1d8748e37671400d7c8201 Apr 23 16:34:55.993259 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:55.993236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:34:55.993365 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.993351 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:55.993406 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.993370 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:55.993406 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.993379 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:55.993462 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:55.993427 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:56.993410857 +0000 UTC m=+4.256285331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:56.323014 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.322975 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:54 +0000 UTC" deadline="2028-01-26 12:13:31.205920646 +0000 UTC" Apr 23 16:34:56.323014 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.323009 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15427h38m34.882914437s" Apr 23 16:34:56.361071 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.359493 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:34:56.361071 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:56.359643 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:34:56.372601 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.372521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" event={"ID":"db812877-6017-42fa-aef3-97e4f03153ce","Type":"ContainerStarted","Data":"4c9fbd5c4a2311602ca32b1e473678a413cdd4d5bce743050a30980b2cadae00"} Apr 23 16:34:56.375632 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.375584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" event={"ID":"edb81deebb56b756d8013eea0bbef625","Type":"ContainerStarted","Data":"d34da8226d69cd99ec9b4dc8195de886b4aa071c5884e35ed20a1c2894a6421a"} Apr 23 16:34:56.378011 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.377966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7fhkj" event={"ID":"2cdbd610-b42f-4ea5-9463-a02f295038a1","Type":"ContainerStarted","Data":"6ac9999c90f936ff5aa1e36134a1671f2a9736d8b8dfff4990d006de82e30c01"} Apr 23 16:34:56.381417 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.381391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" event={"ID":"3e7ecdff-727d-4e4b-8277-cecdb204253c","Type":"ContainerStarted","Data":"a8a6d4e754ad7a57795eae23f6cf366413143baa4d1d8748e37671400d7c8201"} Apr 23 16:34:56.383433 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.383406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x4nj8" event={"ID":"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2","Type":"ContainerStarted","Data":"b60f4dc540d7ce1f7e660bdee352bdb40091fd28ce80a07020f123663ddb7c06"} Apr 23 16:34:56.393945 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.393910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fx8gg" event={"ID":"f44bf524-5ea5-4d89-8f40-5a45087669b8","Type":"ContainerStarted","Data":"2a93d971875076e641adf4cc68d484b42bd6e7ac065a0566b492addeb8477f5e"} Apr 23 16:34:56.397562 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.397076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gfsbv" event={"ID":"2333206a-7852-464c-abb6-f9aab741441c","Type":"ContainerStarted","Data":"0adab9d6c811eba661527b405b92e6210ee974cfcadbe3b8abf36a9c32499483"} Apr 23 16:34:56.401972 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.401945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerStarted","Data":"11bd869b5446018ab81b157066775fc113aa01f79eaf2b64b94ede10c5a20ed2"} Apr 23 16:34:56.408475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.408442 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"c742f2c035edb278ca87d242064717443de26ba91017621c0ca80a1a18ab5868"} Apr 23 16:34:56.900082 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:56.900031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: 
\"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:34:56.900266 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:56.900228 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:56.900324 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:56.900296 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:58.900275763 +0000 UTC m=+6.163150224 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:57.002361 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:57.002325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:34:57.002545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:57.002491 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:57.002545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:57.002510 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:57.002545 ip-10-0-130-110 
kubenswrapper[2575]: E0423 16:34:57.002521 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:57.002694 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:57.002600 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:59.002580506 +0000 UTC m=+6.265454980 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:57.362021 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:57.361924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:34:57.362544 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:57.362066 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:34:57.431067 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:57.431029 2575 generic.go:358] "Generic (PLEG): container finished" podID="25b7669b8a48e9291e3d21d66b2ad87c" containerID="f51def523a1f97b85f6dc2311289202771a276f55587df06feb683af3482f6d4" exitCode=0
Apr 23 16:34:57.431297 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:57.431185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" event={"ID":"25b7669b8a48e9291e3d21d66b2ad87c","Type":"ContainerDied","Data":"f51def523a1f97b85f6dc2311289202771a276f55587df06feb683af3482f6d4"}
Apr 23 16:34:57.448445 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:57.448346 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-110.ec2.internal" podStartSLOduration=3.448327847 podStartE2EDuration="3.448327847s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:56.390042375 +0000 UTC m=+3.652916854" watchObservedRunningTime="2026-04-23 16:34:57.448327847 +0000 UTC m=+4.711202330"
Apr 23 16:34:58.360551 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:58.360454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:34:58.360726 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:58.360600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:34:58.437159 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:58.437117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" event={"ID":"25b7669b8a48e9291e3d21d66b2ad87c","Type":"ContainerStarted","Data":"b4a5fe9595dbb1460b75875aaa4a156b41486f706f5c3bccb1648e6db2803ae6"}
Apr 23 16:34:58.918244 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:58.918206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:34:58.918410 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:58.918360 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:58.918484 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:58.918434 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:02.918412973 +0000 UTC m=+10.181287480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:59.018809 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:59.018768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:34:59.018993 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:59.018977 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:34:59.019066 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:59.019060 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:34:59.019120 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:59.019074 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:59.019171 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:59.019142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:03.01912328 +0000 UTC m=+10.281997754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:59.360577 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:34:59.360093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:34:59.360577 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:34:59.360219 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:00.359872 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:00.359835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:00.360343 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:00.359982 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:01.360677 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:01.360641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:01.361170 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:01.360774 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:02.359621 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:02.359582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:02.359799 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:02.359720 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:02.952252 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:02.951660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:02.952252 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:02.951838 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:02.952252 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:02.951899 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:10.951879963 +0000 UTC m=+18.214754429 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:03.053343 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:03.052894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:03.053343 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:03.053012 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:03.053343 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:03.053025 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:03.053343 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:03.053034 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:03.053343 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:03.053074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.053061818 +0000 UTC m=+18.315936277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:03.362088 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:03.362047 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:03.363644 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:03.362358 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:04.359904 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:04.359869 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:04.360349 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:04.360020 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:05.360319 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:05.360283 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:05.360757 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:05.360415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:06.360297 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:06.360124 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:06.360472 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:06.360374 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:07.360587 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:07.360550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:07.361044 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:07.360689 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:08.359592 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:08.359557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:08.359764 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:08.359664 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:09.360538 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:09.360487 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:09.360970 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:09.360646 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:10.360302 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:10.359846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:10.360302 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:10.359971 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:11.011547 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:11.011492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:11.011941 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.011652 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:11.011941 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.011728 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.011704048 +0000 UTC m=+34.274578593 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:11.112567 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:11.112512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:11.112731 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.112644 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:11.112731 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.112666 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:11.112731 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.112678 2575 projected.go:194] Error preparing data for projected volume kube-api-access-krlrw for pod openshift-network-diagnostics/network-check-target-qlwjk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:11.112905 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.112738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw podName:3b7bc009-2549-40d8-b0f3-979edf176475 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.112724009 +0000 UTC m=+34.375598467 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-krlrw" (UniqueName: "kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw") pod "network-check-target-qlwjk" (UID: "3b7bc009-2549-40d8-b0f3-979edf176475") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:11.359517 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:11.359482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:11.359703 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:11.359647 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:12.359491 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:12.359463 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:12.359936 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:12.359602 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:13.360742 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.360713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:13.361178 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:13.360848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:13.466131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.465947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7fhkj" event={"ID":"2cdbd610-b42f-4ea5-9463-a02f295038a1","Type":"ContainerStarted","Data":"a051d31119341b72de289379b54e703a2368850f7da97064e74abffc1a1b8cf6"}
Apr 23 16:35:13.469232 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.469202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fx8gg" event={"ID":"f44bf524-5ea5-4d89-8f40-5a45087669b8","Type":"ContainerStarted","Data":"9fd506ed0b20987d8f2c83e58de8f9b824d1cca47d6bc75428c384ad9fc58e44"}
Apr 23 16:35:13.472652 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.472622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerStarted","Data":"165cc1b18b526646477b8db6e7b858bf0ea087b51f39dc6d412ae9cb7018c373"}
Apr 23 16:35:13.483104 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.482181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-110.ec2.internal" podStartSLOduration=19.482164772 podStartE2EDuration="19.482164772s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:58.453986935 +0000 UTC m=+5.716861441" watchObservedRunningTime="2026-04-23 16:35:13.482164772 +0000 UTC m=+20.745039256"
Apr 23 16:35:13.502956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.502901 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7fhkj" podStartSLOduration=3.256262386 podStartE2EDuration="20.50288123s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.935875096 +0000 UTC m=+3.198749558" lastFinishedPulling="2026-04-23 16:35:13.182493944 +0000 UTC m=+20.445368402" observedRunningTime="2026-04-23 16:35:13.483176002 +0000 UTC m=+20.746050513" watchObservedRunningTime="2026-04-23 16:35:13.50288123 +0000 UTC m=+20.765755711"
Apr 23 16:35:13.503127 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.503096 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fx8gg" podStartSLOduration=3.245202028 podStartE2EDuration="20.503088173s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.924483708 +0000 UTC m=+3.187358166" lastFinishedPulling="2026-04-23 16:35:13.182369847 +0000 UTC m=+20.445244311" observedRunningTime="2026-04-23 16:35:13.501868392 +0000 UTC m=+20.764742872" watchObservedRunningTime="2026-04-23 16:35:13.503088173 +0000 UTC m=+20.765962653"
Apr 23 16:35:13.522143 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:13.521903 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" podStartSLOduration=3.263319552 podStartE2EDuration="20.521884944s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.925566942 +0000 UTC m=+3.188441413" lastFinishedPulling="2026-04-23 16:35:13.184132336 +0000 UTC m=+20.447006805" observedRunningTime="2026-04-23 16:35:13.521588556 +0000 UTC m=+20.784463035" watchObservedRunningTime="2026-04-23 16:35:13.521884944 +0000 UTC m=+20.784759424"
Apr 23 16:35:14.359697 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.359663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:14.359845 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:14.359767 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:14.480335 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.480293 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="165cc1b18b526646477b8db6e7b858bf0ea087b51f39dc6d412ae9cb7018c373" exitCode=0
Apr 23 16:35:14.481019 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.480372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"165cc1b18b526646477b8db6e7b858bf0ea087b51f39dc6d412ae9cb7018c373"}
Apr 23 16:35:14.483034 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.482902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 16:35:14.483339 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483320 2575 generic.go:358] "Generic (PLEG): container finished" podID="9a94ae8c-bc2c-4382-b11c-ef8e0a012d18" containerID="89022a62933737e9242f3313647b5e54d13f5c8aa0afea5be3d43a03e5279d16" exitCode=1
Apr 23 16:35:14.483409 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"7c2bd39ef0eccdda50558747ef1bcc09150428686eff53865d3b88e3e03103b1"}
Apr 23 16:35:14.483510 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"7c0135464bcb7d9400ee0fd452d3cfee845c3a40aff7a35c475858924ddcd4d6"}
Apr 23 16:35:14.483510 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"a9c0daf5df31add403968d052162910aa8d1d98424af2b750a8bdf31819444f2"}
Apr 23 16:35:14.483510 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"21f244605a4b1b2c4a1a49c1d7e6713171d3f0c21cc515afd91ee3fda3fc2a85"}
Apr 23 16:35:14.483510 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerDied","Data":"89022a62933737e9242f3313647b5e54d13f5c8aa0afea5be3d43a03e5279d16"}
Apr 23 16:35:14.483510 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.483469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"af293045a52b2644eaa516f63c34592e466e633fd7aedbc69d88dd75362b6dc9"}
Apr 23 16:35:14.484471 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.484452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8ltxj" event={"ID":"db812877-6017-42fa-aef3-97e4f03153ce","Type":"ContainerStarted","Data":"39684214c2e5783b32ed20f6982ba24c135a8856a16f8e9b4addf8e47bb62a2c"}
Apr 23 16:35:14.485732 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.485712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" event={"ID":"3e7ecdff-727d-4e4b-8277-cecdb204253c","Type":"ContainerStarted","Data":"6edfc8ebe5339cc5c17abed10999a6dc6b4a4fb0d2cf331bb7bb8e09ac763b5b"}
Apr 23 16:35:14.487044 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.487020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x4nj8" event={"ID":"6e7cb2c4-5b50-4081-b3e2-d41f862f81f2","Type":"ContainerStarted","Data":"c4570fbc638b84d6a71b9860b1d32e642e15ef26781e2a95d276c787fef88659"}
Apr 23 16:35:14.515625 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.515575 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x4nj8" podStartSLOduration=3.984706668 podStartE2EDuration="21.515558405s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.92895084 +0000 UTC m=+3.191825300" lastFinishedPulling="2026-04-23 16:35:13.459802579 +0000 UTC m=+20.722677037" observedRunningTime="2026-04-23 16:35:14.515422105 +0000 UTC m=+21.778296584" watchObservedRunningTime="2026-04-23 16:35:14.515558405 +0000 UTC m=+21.778432886"
Apr 23 16:35:14.606025 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:14.606002 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 16:35:15.356746 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.355938 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:14.60602111Z","UUID":"5cb904c9-5ec0-4fb3-b6a3-f407d7479727","Handler":null,"Name":"","Endpoint":""}
Apr 23 16:35:15.359604 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.359194 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 16:35:15.359604 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.359228 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 16:35:15.360992 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.360490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:15.360992 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:15.360640 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3"
Apr 23 16:35:15.490683 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.490649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gfsbv" event={"ID":"2333206a-7852-464c-abb6-f9aab741441c","Type":"ContainerStarted","Data":"be94147518156a11c04e28396c50381b84e97f654e75a5391f92bbfe8e7e7fbc"}
Apr 23 16:35:15.492820 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.492738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" event={"ID":"3e7ecdff-727d-4e4b-8277-cecdb204253c","Type":"ContainerStarted","Data":"0f98620bfb6a7787959aeeb4d89fb51b45b7750078e91bd0b5af7ce367e39a79"}
Apr 23 16:35:15.492820 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.492770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" event={"ID":"3e7ecdff-727d-4e4b-8277-cecdb204253c","Type":"ContainerStarted","Data":"c1ceca2f01102a5e5b0a740be225fbd8aae8b4a3e7d4c75cae05cf16783d5149"}
Apr 23 16:35:15.560796 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:15.560742 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gfsbv" podStartSLOduration=5.315413003 podStartE2EDuration="22.560727037s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.937158313 +0000 UTC m=+3.200032773" lastFinishedPulling="2026-04-23 16:35:13.182472339 +0000 UTC m=+20.445346807" observedRunningTime="2026-04-23 16:35:15.536995407 +0000 UTC m=+22.799869887" watchObservedRunningTime="2026-04-23 16:35:15.560727037 +0000 UTC m=+22.823601510"
Apr 23 16:35:16.359612 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:16.359575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:16.359814 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:16.359686 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475"
Apr 23 16:35:16.497839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:16.497811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 16:35:16.498288 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:16.498138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"d1bc71925035f15054bd4d298a2151e84931f8c8a88ebeba6cb1ae7459fe1cfc"}
Apr 23 16:35:17.297403 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.297370 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7fhkj"
Apr 23 16:35:17.298477 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.298459 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7fhkj"
Apr 23 16:35:17.315964 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.315908 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqm4p" podStartSLOduration=4.919071989 podStartE2EDuration="24.315892875s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.931876508 +0000 UTC m=+3.194750980" lastFinishedPulling="2026-04-23 16:35:15.328697408 +0000 UTC m=+22.591571866" observedRunningTime="2026-04-23 16:35:15.561996435 +0000 UTC m=+22.824870915" watchObservedRunningTime="2026-04-23 16:35:17.315892875 +0000 UTC m=+24.578767355"
Apr 23 16:35:17.359731 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.359708 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:17.359887 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:17.359834 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3" Apr 23 16:35:17.500036 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.500000 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:35:17.500762 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:17.500514 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7fhkj" Apr 23 16:35:18.359752 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:18.359678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:18.359910 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:18.359801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:35:19.360130 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.359960 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:19.360663 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:19.360237 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3" Apr 23 16:35:19.505755 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.505719 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="542f7a0f7123be11bc7518d267f2089ad9c73064355a2d75fa3c8f9c5f1c5f8d" exitCode=0 Apr 23 16:35:19.505957 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.505815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"542f7a0f7123be11bc7518d267f2089ad9c73064355a2d75fa3c8f9c5f1c5f8d"} Apr 23 16:35:19.508807 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.508791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:35:19.509231 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.509111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"aa5ecea1598129b74f51222d45496b497fa9d976d7c5d7e2c27222b3686bf566"} Apr 23 16:35:19.509578 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:19.509564 2575 scope.go:117] "RemoveContainer" containerID="89022a62933737e9242f3313647b5e54d13f5c8aa0afea5be3d43a03e5279d16" Apr 23 16:35:20.360619 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.360449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:20.360902 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:20.360701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:35:20.464907 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.464838 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qlwjk"] Apr 23 16:35:20.467418 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.467395 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-grr5m"] Apr 23 16:35:20.467522 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.467495 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:20.467619 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:20.467602 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3" Apr 23 16:35:20.512615 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.512583 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="a6306e4664bf2564af1f4b4d8aa34a494934ae88da761b79e0ef9d539abfef3f" exitCode=0 Apr 23 16:35:20.512766 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.512662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"a6306e4664bf2564af1f4b4d8aa34a494934ae88da761b79e0ef9d539abfef3f"} Apr 23 16:35:20.515891 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.515875 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:35:20.516256 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.516241 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:20.516303 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.516249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" event={"ID":"9a94ae8c-bc2c-4382-b11c-ef8e0a012d18","Type":"ContainerStarted","Data":"6ec90afa7dc806853b0722cc631f1a9ec20c348a34235580d9e7ace7ed3f6c6c"} Apr 23 16:35:20.516355 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:20.516327 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:35:20.516491 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.516477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:20.516564 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.516497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:20.516564 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.516507 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:20.531614 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.531583 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:20.532414 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.532400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:20.574922 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.574871 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" podStartSLOduration=10.061728796 podStartE2EDuration="27.574853396s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.928375577 +0000 UTC m=+3.191250042" lastFinishedPulling="2026-04-23 16:35:13.44150018 +0000 UTC m=+20.704374642" observedRunningTime="2026-04-23 16:35:20.574317498 +0000 UTC m=+27.837191979" watchObservedRunningTime="2026-04-23 16:35:20.574853396 +0000 UTC m=+27.837727877" Apr 23 16:35:20.953105 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.953071 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6tzts"] Apr 23 
16:35:20.954782 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.954766 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:20.954845 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:20.954827 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6tzts" podUID="234f7274-d920-45c3-b270-d3c0251a85d9" Apr 23 16:35:20.967307 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:20.967271 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6tzts"] Apr 23 16:35:21.078630 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.078601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.078805 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.078638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-kubelet-config\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.078805 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.078723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-dbus\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.179316 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.179293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.179401 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.179324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-kubelet-config\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.179401 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.179354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-dbus\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.179501 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:21.179440 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:21.179501 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.179457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-kubelet-config\") pod \"global-pull-secret-syncer-6tzts\" (UID: 
\"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.179501 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:21.179501 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret podName:234f7274-d920-45c3-b270-d3c0251a85d9 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:21.679482379 +0000 UTC m=+28.942356838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret") pod "global-pull-secret-syncer-6tzts" (UID: "234f7274-d920-45c3-b270-d3c0251a85d9") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:21.179653 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.179620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/234f7274-d920-45c3-b270-d3c0251a85d9-dbus\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.519475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.519435 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="4c2fcd64eec7e0b87cb9f47f03dbebebfdfa7b5d668b6c73d41c14381c8890a6" exitCode=0 Apr 23 16:35:21.519898 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.519540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"4c2fcd64eec7e0b87cb9f47f03dbebebfdfa7b5d668b6c73d41c14381c8890a6"} Apr 23 16:35:21.520349 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.520082 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.520349 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:21.520203 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6tzts" podUID="234f7274-d920-45c3-b270-d3c0251a85d9" Apr 23 16:35:21.684696 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:21.684639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:21.684862 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:21.684794 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:21.684914 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:21.684867 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret podName:234f7274-d920-45c3-b270-d3c0251a85d9 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:22.684848343 +0000 UTC m=+29.947722805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret") pod "global-pull-secret-syncer-6tzts" (UID: "234f7274-d920-45c3-b270-d3c0251a85d9") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:22.360231 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:22.360197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:22.360231 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:22.360222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:22.360489 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:22.360328 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:35:22.360489 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:22.360460 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3" Apr 23 16:35:22.693136 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:22.693098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:22.693874 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:22.693225 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:22.693874 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:22.693284 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret podName:234f7274-d920-45c3-b270-d3c0251a85d9 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.693270211 +0000 UTC m=+31.956144671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret") pod "global-pull-secret-syncer-6tzts" (UID: "234f7274-d920-45c3-b270-d3c0251a85d9") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:23.360372 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:23.360345 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:23.360552 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:23.360450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6tzts" podUID="234f7274-d920-45c3-b270-d3c0251a85d9" Apr 23 16:35:24.360367 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:24.360337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:24.360367 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:24.360358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:24.360884 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:24.360458 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qlwjk" podUID="3b7bc009-2549-40d8-b0f3-979edf176475" Apr 23 16:35:24.360884 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:24.360597 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-grr5m" podUID="53be1a16-14c5-4f4a-b293-a31505dc39e3" Apr 23 16:35:24.707243 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:24.707161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:24.707397 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:24.707303 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:24.707397 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:24.707357 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret podName:234f7274-d920-45c3-b270-d3c0251a85d9 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:28.707342272 +0000 UTC m=+35.970216730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret") pod "global-pull-secret-syncer-6tzts" (UID: "234f7274-d920-45c3-b270-d3c0251a85d9") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:25.360202 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:25.360160 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:25.360376 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:25.360308 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6tzts" podUID="234f7274-d920-45c3-b270-d3c0251a85d9" Apr 23 16:35:26.089609 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.089410 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-110.ec2.internal" event="NodeReady" Apr 23 16:35:26.090031 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.089717 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:26.130076 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.130038 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"] Apr 23 16:35:26.152427 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.152383 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"] Apr 23 16:35:26.152637 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.152582 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.156683 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.156222 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:35:26.156683 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.156230 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-d4d8w\"" Apr 23 16:35:26.156683 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.156493 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 16:35:26.156683 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.156684 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 16:35:26.166383 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.166345 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sqtch"] Apr 23 16:35:26.169386 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.169357 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 16:35:26.183657 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.182757 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-9d787c8db-s4x7f"] Apr 23 16:35:26.183657 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.182799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.183657 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.182813 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"
Apr 23 16:35:26.187030 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187005 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.187387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187369 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 16:35:26.187465 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.187559 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.187730 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187700 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 16:35:26.187798 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187778 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 16:35:26.187961 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.187946 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.188333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.188315 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-67fr9\""
Apr 23 16:35:26.188985 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.188955 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 16:35:26.189463 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.189444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mvzbd\""
Apr 23 16:35:26.193972 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.193870 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"]
Apr 23 16:35:26.196123 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.196021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.200012 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.199988 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 16:35:26.201088 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.201068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.201182 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.201138 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 16:35:26.201650 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.201630 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 16:35:26.203577 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.203554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 16:35:26.204075 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.204047 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 16:35:26.204075 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.204074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.204213 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.204080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-trh28\""
Apr 23 16:35:26.211849 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.211795 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"]
Apr 23 16:35:26.212228 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.212205 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"
Apr 23 16:35:26.215064 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.215043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-d6jjb\""
Apr 23 16:35:26.215744 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.215728 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 16:35:26.215849 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.215770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.215849 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.215822 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 16:35:26.216362 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.216343 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.226649 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.226625 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"]
Apr 23 16:35:26.227128 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.227105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"
Apr 23 16:35:26.231922 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.231896 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 16:35:26.232194 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.232177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.232415 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.232400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tnww5\""
Apr 23 16:35:26.232777 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.232760 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.245036 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.245018 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz"]
Apr 23 16:35:26.245217 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.245190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.248449 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.248426 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 16:35:26.248826 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.248805 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 16:35:26.249014 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.248990 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.249153 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.249040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hdqzh\""
Apr 23 16:35:26.249153 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.249139 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.257925 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.257905 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4s49"]
Apr 23 16:35:26.258069 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.258054 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz"
Apr 23 16:35:26.261725 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.261705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.262353 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.262336 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mcgbc\""
Apr 23 16:35:26.262475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.262385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.285607 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285580 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285617 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285660 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49"
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285667 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9d787c8db-s4x7f"]
Apr 23 16:35:26.285747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.285679 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5kp97"]
Apr 23 16:35:26.289551 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.289503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.289666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.289551 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.289666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.289555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 16:35:26.289666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.289510 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-mhx4z\""
Apr 23 16:35:26.289881 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.289864 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 16:35:26.295357 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.295328 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 16:35:26.307158 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz"]
Apr 23 16:35:26.307311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sqtch"]
Apr 23 16:35:26.307311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307179 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4s49"]
Apr 23 16:35:26.307311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307195 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5kp97"]
Apr 23 16:35:26.307311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307218 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-blbwq"]
Apr 23 16:35:26.307311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.307238 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5kp97"
Apr 23 16:35:26.311415 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.311390 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.311854 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.311681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.311854 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.311730 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:35:26.311854 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.311776 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w6ksl\""
Apr 23 16:35:26.316968 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.316940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjz6b\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.317076 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.316983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxb88\" (UniqueName: \"kubernetes.io/projected/5f517baa-7cef-488d-bba8-ced00dd978ea-kube-api-access-rxb88\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.317076 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982b5b8d-3beb-472b-af96-34991930ce23-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"
Apr 23 16:35:26.317180 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317255 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.317314 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.317314 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-tmp\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317402 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-snapshots\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317438 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvh8\" (UniqueName: \"kubernetes.io/projected/4d5e2043-b837-4d41-b2f9-b6155ad80b36-kube-api-access-hvvh8\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317495 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"
Apr 23 16:35:26.317495 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38274876-af8d-4558-a05f-022de9171c7d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.317495 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"
Apr 23 16:35:26.317672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"
Apr 23 16:35:26.317672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.317672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9k8\" (UniqueName: \"kubernetes.io/projected/982b5b8d-3beb-472b-af96-34991930ce23-kube-api-access-jg9k8\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"
Apr 23 16:35:26.317672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.317672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e2043-b837-4d41-b2f9-b6155ad80b36-serving-cert\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhcl8\" (UniqueName: \"kubernetes.io/projected/7f11952e-c294-49f7-be03-3644d42784ef-kube-api-access-rhcl8\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-service-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982b5b8d-3beb-472b-af96-34991930ce23-config\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzj8f\" (UniqueName: \"kubernetes.io/projected/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-kube-api-access-xzj8f\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxzl\" (UniqueName: \"kubernetes.io/projected/38274876-af8d-4558-a05f-022de9171c7d-kube-api-access-2hxzl\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-default-certificate\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.317894 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.318250 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38274876-af8d-4558-a05f-022de9171c7d-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.318250 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-stats-auth\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.318250 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.318250 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.317989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.325450 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.325422 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-blbwq"]
Apr 23 16:35:26.325620 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.325601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-blbwq"
Apr 23 16:35:26.328688 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.328668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-szgdw\""
Apr 23 16:35:26.328917 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.328897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:35:26.329152 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.329131 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.329384 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.329365 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:35:26.329478 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.329405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.359637 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.359602 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:35:26.359637 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.359656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m"
Apr 23 16:35:26.362545 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.362498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.362674 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.362593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hhjgv\""
Apr 23 16:35:26.362674 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.362629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:35:26.362674 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.362650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.362896 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.362678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\""
Apr 23 16:35:26.419005 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.418971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419005 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e2043-b837-4d41-b2f9-b6155ad80b36-serving-cert\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.419252 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhcl8\" (UniqueName: \"kubernetes.io/projected/7f11952e-c294-49f7-be03-3644d42784ef-kube-api-access-rhcl8\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"
Apr 23 16:35:26.419312 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-config\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49"
Apr 23 16:35:26.419312 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419395 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-service-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch"
Apr 23 16:35:26.419395 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982b5b8d-3beb-472b-af96-34991930ce23-config\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"
Apr 23 16:35:26.419487 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzj8f\" (UniqueName: \"kubernetes.io/projected/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-kube-api-access-xzj8f\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"
Apr 23 16:35:26.419487 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419487 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgh8p\" (UniqueName: \"kubernetes.io/projected/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-kube-api-access-wgh8p\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq"
Apr 23 16:35:26.419487 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxzl\" (UniqueName: \"kubernetes.io/projected/38274876-af8d-4558-a05f-022de9171c7d-kube-api-access-2hxzl\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-default-certificate\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38274876-af8d-4558-a05f-022de9171c7d-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-stats-auth\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjz6b\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:35:26.419709 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-serving-cert\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49"
Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxb88\" (UniqueName: \"kubernetes.io/projected/5f517baa-7cef-488d-bba8-ced00dd978ea-kube-api-access-rxb88\") pod
\"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982b5b8d-3beb-472b-af96-34991930ce23-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419908 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-tmp\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-snapshots\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.419979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvh8\" (UniqueName: \"kubernetes.io/projected/4d5e2043-b837-4d41-b2f9-b6155ad80b36-kube-api-access-hvvh8\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.420346 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.419993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:26.420346 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.420106 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:35:26.420346 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/982b5b8d-3beb-472b-af96-34991930ce23-config\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.420346 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.420173 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.920150195 +0000 UTC m=+34.183024667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:26.420346 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-tmp\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.420754 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420732 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.420839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38274876-af8d-4558-a05f-022de9171c7d-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" Apr 23 16:35:26.420839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4d5e2043-b837-4d41-b2f9-b6155ad80b36-snapshots\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.420839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.420833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.421093 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-service-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.421178 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.921120697 +0000 UTC m=+34.183995178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421428 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421449 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421518 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.921499551 +0000 UTC m=+34.184374025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-trusted-ca\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38274876-af8d-4558-a05f-022de9171c7d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e2043-b837-4d41-b2f9-b6155ad80b36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sqtch\" (UID: 
\"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421820 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls podName:7f11952e-c294-49f7-be03-3644d42784ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.921849839 +0000 UTC m=+34.184724298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jx8jb" (UID: "7f11952e-c294-49f7-be03-3644d42784ef") : secret "samples-operator-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421906 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9klww\" (UniqueName: \"kubernetes.io/projected/ff43e2b1-dadc-4902-8486-e9cccc93a38b-kube-api-access-9klww\") pod \"volume-data-source-validator-7c6cbb6c87-768xz\" (UID: \"ff43e2b1-dadc-4902-8486-e9cccc93a38b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.421954 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.921939082 +0000 UTC m=+34.184813542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:26.422309 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.421989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-tmp-dir\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm576\" (UniqueName: \"kubernetes.io/projected/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-kube-api-access-bm576\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9k8\" (UniqueName: \"kubernetes.io/projected/982b5b8d-3beb-472b-af96-34991930ce23-kube-api-access-jg9k8\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 
16:35:26.422091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrbs\" (UniqueName: \"kubernetes.io/projected/8b59acd4-b73a-4a98-962f-47e8e830e452-kube-api-access-kbrbs\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-config-volume\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.423198 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.422647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:26.424141 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.424111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/982b5b8d-3beb-472b-af96-34991930ce23-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.424141 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.424128 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.424275 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.424116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e2043-b837-4d41-b2f9-b6155ad80b36-serving-cert\") pod \"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.424275 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.424140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.425206 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.425189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/38274876-af8d-4558-a05f-022de9171c7d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" Apr 23 16:35:26.430244 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.430192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxb88\" (UniqueName: \"kubernetes.io/projected/5f517baa-7cef-488d-bba8-ced00dd978ea-kube-api-access-rxb88\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.430416 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.430393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhcl8\" (UniqueName: \"kubernetes.io/projected/7f11952e-c294-49f7-be03-3644d42784ef-kube-api-access-rhcl8\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:26.432513 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.432490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxzl\" (UniqueName: \"kubernetes.io/projected/38274876-af8d-4558-a05f-022de9171c7d-kube-api-access-2hxzl\") pod \"kube-storage-version-migrator-operator-6769c5d45-6vzmb\" (UID: \"38274876-af8d-4558-a05f-022de9171c7d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" Apr 23 16:35:26.432801 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.432776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvh8\" (UniqueName: \"kubernetes.io/projected/4d5e2043-b837-4d41-b2f9-b6155ad80b36-kube-api-access-hvvh8\") pod 
\"insights-operator-585dfdc468-sqtch\" (UID: \"4d5e2043-b837-4d41-b2f9-b6155ad80b36\") " pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.436653 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.436567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9k8\" (UniqueName: \"kubernetes.io/projected/982b5b8d-3beb-472b-af96-34991930ce23-kube-api-access-jg9k8\") pod \"service-ca-operator-d6fc45fc5-wgk4r\" (UID: \"982b5b8d-3beb-472b-af96-34991930ce23\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.436740 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.436691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.436740 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.436695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjz6b\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.438407 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.438386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-stats-auth\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.438588 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.438521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-default-certificate\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.440199 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.440180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzj8f\" (UniqueName: \"kubernetes.io/projected/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-kube-api-access-xzj8f\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:26.501336 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.501294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sqtch" Apr 23 16:35:26.522581 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.522554 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" Apr 23 16:35:26.522843 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.522806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-serving-cert\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.522941 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.522925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-trusted-ca\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.523000 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.522968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:26.523059 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.522997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9klww\" (UniqueName: \"kubernetes.io/projected/ff43e2b1-dadc-4902-8486-e9cccc93a38b-kube-api-access-9klww\") pod \"volume-data-source-validator-7c6cbb6c87-768xz\" (UID: \"ff43e2b1-dadc-4902-8486-e9cccc93a38b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" Apr 23 16:35:26.523059 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-tmp-dir\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.523150 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm576\" (UniqueName: \"kubernetes.io/projected/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-kube-api-access-bm576\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.523150 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrbs\" (UniqueName: \"kubernetes.io/projected/8b59acd4-b73a-4a98-962f-47e8e830e452-kube-api-access-kbrbs\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:26.523150 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-config-volume\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.523295 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.523295 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-config\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.523473 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.523305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgh8p\" (UniqueName: \"kubernetes.io/projected/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-kube-api-access-wgh8p\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.523605 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.523056 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:26.523605 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.523704 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.023678212 +0000 UTC m=+34.286552686 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:26.524075 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.524020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-tmp-dir\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.524443 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.524408 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:26.524555 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.524499 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.024468988 +0000 UTC m=+34.287343448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:26.524628 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.524588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-config-volume\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.524743 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.524719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-config\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.524816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.524723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-trusted-ca\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.529000 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.528970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-serving-cert\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.535754 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.535727 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9klww\" (UniqueName: \"kubernetes.io/projected/ff43e2b1-dadc-4902-8486-e9cccc93a38b-kube-api-access-9klww\") pod \"volume-data-source-validator-7c6cbb6c87-768xz\" (UID: \"ff43e2b1-dadc-4902-8486-e9cccc93a38b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" Apr 23 16:35:26.535962 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.535938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgh8p\" (UniqueName: \"kubernetes.io/projected/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-kube-api-access-wgh8p\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:26.537471 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.537439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrbs\" (UniqueName: \"kubernetes.io/projected/8b59acd4-b73a-4a98-962f-47e8e830e452-kube-api-access-kbrbs\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:26.537816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.537791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm576\" (UniqueName: \"kubernetes.io/projected/c0fa4bfb-d80b-4502-9523-25faf9cd73c4-kube-api-access-bm576\") pod \"console-operator-9d4b6777b-j4s49\" (UID: \"c0fa4bfb-d80b-4502-9523-25faf9cd73c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.560019 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.559993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qph7m"] Apr 23 16:35:26.567372 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.567349 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.569604 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.569585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" Apr 23 16:35:26.569721 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.569648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" Apr 23 16:35:26.570401 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.570329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285c2\"" Apr 23 16:35:26.597552 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.597503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:26.725839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.725746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5cbdaaf-58dd-418a-9710-5657fc8aae17-hosts-file\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.725839 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.725819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk5zc\" (UniqueName: \"kubernetes.io/projected/e5cbdaaf-58dd-418a-9710-5657fc8aae17-kube-api-access-qk5zc\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.726046 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.725883 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5cbdaaf-58dd-418a-9710-5657fc8aae17-tmp-dir\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.826808 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.826775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5cbdaaf-58dd-418a-9710-5657fc8aae17-hosts-file\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.827011 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.826849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5zc\" (UniqueName: \"kubernetes.io/projected/e5cbdaaf-58dd-418a-9710-5657fc8aae17-kube-api-access-qk5zc\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.827011 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.826896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5cbdaaf-58dd-418a-9710-5657fc8aae17-tmp-dir\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.827011 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.826933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5cbdaaf-58dd-418a-9710-5657fc8aae17-hosts-file\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.837763 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.837733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5cbdaaf-58dd-418a-9710-5657fc8aae17-tmp-dir\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.840346 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.840310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk5zc\" (UniqueName: \"kubernetes.io/projected/e5cbdaaf-58dd-418a-9710-5657fc8aae17-kube-api-access-qk5zc\") pod \"node-resolver-qph7m\" (UID: \"e5cbdaaf-58dd-418a-9710-5657fc8aae17\") " pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.877829 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.877796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qph7m" Apr 23 16:35:26.927971 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.927926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.928000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.928028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: 
\"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.928060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928080 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928096 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:26.928125 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:26.928096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928151 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.928133693 +0000 UTC m=+35.191008159 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928177 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928187 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928197 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928194 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.928173094 +0000 UTC m=+35.191047571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928262 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:27.928250195 +0000 UTC m=+35.191124657 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls podName:7f11952e-c294-49f7-be03-3644d42784ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.928268706 +0000 UTC m=+35.191143164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jx8jb" (UID: "7f11952e-c294-49f7-be03-3644d42784ef") : secret "samples-operator-tls" not found Apr 23 16:35:26.928430 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:26.928338 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.928305772 +0000 UTC m=+35.191180231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:27.029175 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.029133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:27.029332 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.029190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:27.029332 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.029227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:27.029332 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029254 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:27.029332 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:28.029293557 +0000 UTC m=+35.292168038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:27.029508 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029346 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:35:27.029508 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029393 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs podName:53be1a16-14c5-4f4a-b293-a31505dc39e3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:59.029382777 +0000 UTC m=+66.292257235 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs") pod "network-metrics-daemon-grr5m" (UID: "53be1a16-14c5-4f4a-b293-a31505dc39e3") : secret "metrics-daemon-secret" not found Apr 23 16:35:27.029508 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029439 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:27.029508 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.029473 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:28.029461936 +0000 UTC m=+35.292336394 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:27.130369 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.130331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:27.132974 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.132943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlrw\" (UniqueName: \"kubernetes.io/projected/3b7bc009-2549-40d8-b0f3-979edf176475-kube-api-access-krlrw\") pod \"network-check-target-qlwjk\" (UID: \"3b7bc009-2549-40d8-b0f3-979edf176475\") " pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:27.275019 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.274521 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:27.362418 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.362160 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:27.365944 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.365731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:35:27.433847 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.433647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r"] Apr 23 16:35:27.437605 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.437189 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sqtch"] Apr 23 16:35:27.439798 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.439776 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz"] Apr 23 16:35:27.444837 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.444814 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb"] Apr 23 16:35:27.447569 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.447546 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4s49"] Apr 23 16:35:27.487516 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:27.487484 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5e2043_b837_4d41_b2f9_b6155ad80b36.slice/crio-f73bcd75d99ac5956ada8fb1bf54348e11e163e38a1a235c28166cb236bed523 WatchSource:0}: Error finding container f73bcd75d99ac5956ada8fb1bf54348e11e163e38a1a235c28166cb236bed523: Status 404 returned error can't find the container with id f73bcd75d99ac5956ada8fb1bf54348e11e163e38a1a235c28166cb236bed523 Apr 23 16:35:27.488688 ip-10-0-130-110 kubenswrapper[2575]: W0423 
16:35:27.488665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982b5b8d_3beb_472b_af96_34991930ce23.slice/crio-de63be647eb667316de3e4eb2b70f2a8a19e793f9a209568625ab5c3ba60160d WatchSource:0}: Error finding container de63be647eb667316de3e4eb2b70f2a8a19e793f9a209568625ab5c3ba60160d: Status 404 returned error can't find the container with id de63be647eb667316de3e4eb2b70f2a8a19e793f9a209568625ab5c3ba60160d Apr 23 16:35:27.489141 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:27.489120 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff43e2b1_dadc_4902_8486_e9cccc93a38b.slice/crio-6011a1f6f53c3d32a1dc7064df9d95e6d04e1f4e532a0496e2abbd67a6d348c8 WatchSource:0}: Error finding container 6011a1f6f53c3d32a1dc7064df9d95e6d04e1f4e532a0496e2abbd67a6d348c8: Status 404 returned error can't find the container with id 6011a1f6f53c3d32a1dc7064df9d95e6d04e1f4e532a0496e2abbd67a6d348c8 Apr 23 16:35:27.490938 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:27.490915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38274876_af8d_4558_a05f_022de9171c7d.slice/crio-31d8cd5004f99eaeab05d8d3d8896f97e94e7948008af28ff97e7b3b9bc20a0f WatchSource:0}: Error finding container 31d8cd5004f99eaeab05d8d3d8896f97e94e7948008af28ff97e7b3b9bc20a0f: Status 404 returned error can't find the container with id 31d8cd5004f99eaeab05d8d3d8896f97e94e7948008af28ff97e7b3b9bc20a0f Apr 23 16:35:27.491462 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:27.491384 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0fa4bfb_d80b_4502_9523_25faf9cd73c4.slice/crio-35d8b714afb1fcddead82b3f0f74d96f0522120b2e2f282a24c17dd8ff9aa86d WatchSource:0}: Error finding container 
35d8b714afb1fcddead82b3f0f74d96f0522120b2e2f282a24c17dd8ff9aa86d: Status 404 returned error can't find the container with id 35d8b714afb1fcddead82b3f0f74d96f0522120b2e2f282a24c17dd8ff9aa86d Apr 23 16:35:27.496947 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.496924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qlwjk"] Apr 23 16:35:27.499647 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:27.499623 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7bc009_2549_40d8_b0f3_979edf176475.slice/crio-9e5de97c6f6648ed8b17130addc3ace8dddd5621ffe0f76951e83c33a093d141 WatchSource:0}: Error finding container 9e5de97c6f6648ed8b17130addc3ace8dddd5621ffe0f76951e83c33a093d141: Status 404 returned error can't find the container with id 9e5de97c6f6648ed8b17130addc3ace8dddd5621ffe0f76951e83c33a093d141 Apr 23 16:35:27.533708 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.533684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qlwjk" event={"ID":"3b7bc009-2549-40d8-b0f3-979edf176475","Type":"ContainerStarted","Data":"9e5de97c6f6648ed8b17130addc3ace8dddd5621ffe0f76951e83c33a093d141"} Apr 23 16:35:27.534667 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.534640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sqtch" event={"ID":"4d5e2043-b837-4d41-b2f9-b6155ad80b36","Type":"ContainerStarted","Data":"f73bcd75d99ac5956ada8fb1bf54348e11e163e38a1a235c28166cb236bed523"} Apr 23 16:35:27.535635 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.535614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" 
event={"ID":"38274876-af8d-4558-a05f-022de9171c7d","Type":"ContainerStarted","Data":"31d8cd5004f99eaeab05d8d3d8896f97e94e7948008af28ff97e7b3b9bc20a0f"} Apr 23 16:35:27.536403 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.536387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" event={"ID":"ff43e2b1-dadc-4902-8486-e9cccc93a38b","Type":"ContainerStarted","Data":"6011a1f6f53c3d32a1dc7064df9d95e6d04e1f4e532a0496e2abbd67a6d348c8"} Apr 23 16:35:27.537313 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.537294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" event={"ID":"c0fa4bfb-d80b-4502-9523-25faf9cd73c4","Type":"ContainerStarted","Data":"35d8b714afb1fcddead82b3f0f74d96f0522120b2e2f282a24c17dd8ff9aa86d"} Apr 23 16:35:27.538456 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.538438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qph7m" event={"ID":"e5cbdaaf-58dd-418a-9710-5657fc8aae17","Type":"ContainerStarted","Data":"b08d63d757147e2a91d0fda361584527b5db88166cb991f3f6c4de9b5a657a7e"} Apr 23 16:35:27.538548 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.538461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qph7m" event={"ID":"e5cbdaaf-58dd-418a-9710-5657fc8aae17","Type":"ContainerStarted","Data":"373d16b125e02e7b7904d07dca0056554bbc157dc09b4ac6520565396eeaf771"} Apr 23 16:35:27.539426 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.539403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" event={"ID":"982b5b8d-3beb-472b-af96-34991930ce23","Type":"ContainerStarted","Data":"de63be647eb667316de3e4eb2b70f2a8a19e793f9a209568625ab5c3ba60160d"} Apr 23 16:35:27.559007 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.557872 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qph7m" podStartSLOduration=1.557857864 podStartE2EDuration="1.557857864s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:27.557491626 +0000 UTC m=+34.820366102" watchObservedRunningTime="2026-04-23 16:35:27.557857864 +0000 UTC m=+34.820732339" Apr 23 16:35:27.938320 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.938215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:27.938320 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.938313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938382 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938398 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938453 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938458 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.938439199 +0000 UTC m=+37.201313674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.938383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:27.938545 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938488 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.938477594 +0000 UTC m=+37.201352053 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:27.938851 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938549 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.938502978 +0000 UTC m=+37.201377439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:27.938851 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.938602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:27.938851 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:27.938643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:27.938851 ip-10-0-130-110 kubenswrapper[2575]: E0423 
16:35:27.938839 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:27.938851 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938850 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:35:27.939055 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938884 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.938870083 +0000 UTC m=+37.201744562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:27.939055 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:27.938906 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls podName:7f11952e-c294-49f7-be03-3644d42784ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.938893211 +0000 UTC m=+37.201767668 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jx8jb" (UID: "7f11952e-c294-49f7-be03-3644d42784ef") : secret "samples-operator-tls" not found Apr 23 16:35:28.040368 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.040330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:28.040542 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.040391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:28.040651 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:28.040634 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:28.040708 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:28.040700 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:30.040681812 +0000 UTC m=+37.303556278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:28.041135 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:28.041118 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:28.041195 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:28.041169 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:30.041154919 +0000 UTC m=+37.304029380 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:28.548158 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.547228 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="2a76e53a964606de9c190b39ecb66e9c47e0b85086fde45b11f97a66054d0e27" exitCode=0 Apr 23 16:35:28.548158 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.547274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"2a76e53a964606de9c190b39ecb66e9c47e0b85086fde45b11f97a66054d0e27"} Apr 23 16:35:28.747522 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.746576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod 
\"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:28.757340 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.757285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/234f7274-d920-45c3-b270-d3c0251a85d9-original-pull-secret\") pod \"global-pull-secret-syncer-6tzts\" (UID: \"234f7274-d920-45c3-b270-d3c0251a85d9\") " pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:28.892933 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:28.892903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6tzts" Apr 23 16:35:29.072226 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.072194 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6tzts"] Apr 23 16:35:29.554077 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.554034 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a621457-5864-41d6-92e2-1094d54e41ae" containerID="9a1d4d0fbacdd827e847b6c1468ef9434c26ff485d0223bf4421730863c00bdf" exitCode=0 Apr 23 16:35:29.554779 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.554094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerDied","Data":"9a1d4d0fbacdd827e847b6c1468ef9434c26ff485d0223bf4421730863c00bdf"} Apr 23 16:35:29.962328 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.962287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 
23 16:35:29.962508 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.962345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:29.962508 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.962382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:29.962508 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.962415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:29.962696 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:29.962544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:29.962746 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.962704 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 
23 16:35:29.962746 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.962721 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:29.962830 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.962785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.962763204 +0000 UTC m=+41.225637684 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:29.963194 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963172 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.963160378 +0000 UTC m=+41.226034836 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:29.963303 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963254 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:35:29.963303 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963296 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.96328343 +0000 UTC m=+41.226157902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:29.963406 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963354 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:35:29.963406 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963391 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls podName:7f11952e-c294-49f7-be03-3644d42784ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.963379927 +0000 UTC m=+41.226254399 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jx8jb" (UID: "7f11952e-c294-49f7-be03-3644d42784ef") : secret "samples-operator-tls" not found Apr 23 16:35:29.963516 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963443 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:29.963516 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:29.963475 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.96346551 +0000 UTC m=+41.226339983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:30.063725 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:30.063679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:30.063897 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:30.063739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " 
pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:30.063897 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:30.063849 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:30.064070 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:30.063923 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:34.063903304 +0000 UTC m=+41.326777761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:30.064070 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:30.063852 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:30.064070 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:30.064022 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:34.0640033 +0000 UTC m=+41.326877771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:30.557787 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:30.557751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6tzts" event={"ID":"234f7274-d920-45c3-b270-d3c0251a85d9","Type":"ContainerStarted","Data":"3d139d67ceb1f1f0658a825d29ade7b44e3bba9fe6c7643f79710f24c1092258"} Apr 23 16:35:33.201332 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.201296 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z"] Apr 23 16:35:33.228753 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.228140 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z"] Apr 23 16:35:33.228753 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.228265 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" Apr 23 16:35:33.231613 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.231574 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-znxpk\"" Apr 23 16:35:33.391584 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.391551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkd7\" (UniqueName: \"kubernetes.io/projected/d1122629-0bc9-46dd-aed2-f2d5e2295f30-kube-api-access-9wkd7\") pod \"network-check-source-8894fc9bd-4jx4z\" (UID: \"d1122629-0bc9-46dd-aed2-f2d5e2295f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" Apr 23 16:35:33.493094 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.492996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkd7\" (UniqueName: \"kubernetes.io/projected/d1122629-0bc9-46dd-aed2-f2d5e2295f30-kube-api-access-9wkd7\") pod \"network-check-source-8894fc9bd-4jx4z\" (UID: \"d1122629-0bc9-46dd-aed2-f2d5e2295f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" Apr 23 16:35:33.505050 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.505014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkd7\" (UniqueName: \"kubernetes.io/projected/d1122629-0bc9-46dd-aed2-f2d5e2295f30-kube-api-access-9wkd7\") pod \"network-check-source-8894fc9bd-4jx4z\" (UID: \"d1122629-0bc9-46dd-aed2-f2d5e2295f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" Apr 23 16:35:33.541979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.541924 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" Apr 23 16:35:33.997107 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.997077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:33.997225 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.997133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:33.997225 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.997171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:33.997225 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.997197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:33.997391 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:33.997297 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:33.997431 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997416 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:33.997469 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997430 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:33.997513 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997496 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:41.997475741 +0000 UTC m=+49.260350200 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:33.997695 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997670 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:33.997833 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997697 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:35:33.997833 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997670 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:35:33.997833 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997749 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:41.997736589 +0000 UTC m=+49.260611052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:33.997833 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:41.997777883 +0000 UTC m=+49.260652344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:33.997833 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997811 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:41.997800397 +0000 UTC m=+49.260674860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:33.998036 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:33.997841 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls podName:7f11952e-c294-49f7-be03-3644d42784ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:41.99783269 +0000 UTC m=+49.260707148 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jx8jb" (UID: "7f11952e-c294-49f7-be03-3644d42784ef") : secret "samples-operator-tls" not found Apr 23 16:35:34.098440 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.098394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:34.098664 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:34.098514 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:34.098664 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.098595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:34.098664 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:34.098647 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.098616521 +0000 UTC m=+49.361490994 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:34.098664 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:34.098652 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:34.098885 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:34.098708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.098693252 +0000 UTC m=+49.361567711 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:34.453157 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.453126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z"] Apr 23 16:35:34.457656 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:34.457628 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1122629_0bc9_46dd_aed2_f2d5e2295f30.slice/crio-36d13123953d24e26b4366de6fb1e4812015b298c8fc376078effc7b44d43176 WatchSource:0}: Error finding container 36d13123953d24e26b4366de6fb1e4812015b298c8fc376078effc7b44d43176: Status 404 returned error can't find the container with id 36d13123953d24e26b4366de6fb1e4812015b298c8fc376078effc7b44d43176 Apr 23 16:35:34.571223 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.571047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" event={"ID":"d1122629-0bc9-46dd-aed2-f2d5e2295f30","Type":"ContainerStarted","Data":"36d13123953d24e26b4366de6fb1e4812015b298c8fc376078effc7b44d43176"} Apr 23 16:35:34.575867 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.574714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sqtch" event={"ID":"4d5e2043-b837-4d41-b2f9-b6155ad80b36","Type":"ContainerStarted","Data":"869b250ef1696944393d7d450136adc463bec194ea76b16a98d8113af530d1f8"} Apr 23 16:35:34.577605 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.577509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" event={"ID":"38274876-af8d-4558-a05f-022de9171c7d","Type":"ContainerStarted","Data":"94030261ff5411f26d9cc012dbe497d78f67e8575d33d9adf66f48d4f0050180"} Apr 23 16:35:34.579946 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.579882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" event={"ID":"ff43e2b1-dadc-4902-8486-e9cccc93a38b","Type":"ContainerStarted","Data":"ca780a9c38d012744de5a5eb9883b9ff292aaf0d912271e05c2e21be9ca4ffbe"} Apr 23 16:35:34.583596 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.583577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kznw2" event={"ID":"0a621457-5864-41d6-92e2-1094d54e41ae","Type":"ContainerStarted","Data":"eef427482668e83bc711daec911bf89b2dd277861705949608d16e61f8f9e350"} Apr 23 16:35:34.620663 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.620072 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-sqtch" podStartSLOduration=16.809979881 podStartE2EDuration="23.620051755s" 
podCreationTimestamp="2026-04-23 16:35:11 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.48998378 +0000 UTC m=+34.752858252" lastFinishedPulling="2026-04-23 16:35:34.300055666 +0000 UTC m=+41.562930126" observedRunningTime="2026-04-23 16:35:34.608562201 +0000 UTC m=+41.871436682" watchObservedRunningTime="2026-04-23 16:35:34.620051755 +0000 UTC m=+41.882926235" Apr 23 16:35:34.637438 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.637382 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-768xz" podStartSLOduration=6.841623574 podStartE2EDuration="13.637360812s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.504959344 +0000 UTC m=+34.767833807" lastFinishedPulling="2026-04-23 16:35:34.300696582 +0000 UTC m=+41.563571045" observedRunningTime="2026-04-23 16:35:34.636806164 +0000 UTC m=+41.899680645" watchObservedRunningTime="2026-04-23 16:35:34.637360812 +0000 UTC m=+41.900235294" Apr 23 16:35:34.671580 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.671496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" podStartSLOduration=6.873083087 podStartE2EDuration="13.671475299s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.505009303 +0000 UTC m=+34.767883762" lastFinishedPulling="2026-04-23 16:35:34.303401502 +0000 UTC m=+41.566275974" observedRunningTime="2026-04-23 16:35:34.669702721 +0000 UTC m=+41.932577202" watchObservedRunningTime="2026-04-23 16:35:34.671475299 +0000 UTC m=+41.934349781" Apr 23 16:35:34.767768 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:34.767311 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kznw2" podStartSLOduration=10.175864176 
podStartE2EDuration="41.767293822s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.935893575 +0000 UTC m=+3.198768048" lastFinishedPulling="2026-04-23 16:35:27.527323237 +0000 UTC m=+34.790197694" observedRunningTime="2026-04-23 16:35:34.76584811 +0000 UTC m=+42.028722592" watchObservedRunningTime="2026-04-23 16:35:34.767293822 +0000 UTC m=+42.030168304" Apr 23 16:35:35.588852 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.588816 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/0.log" Apr 23 16:35:35.589299 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.588861 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0fa4bfb-d80b-4502-9523-25faf9cd73c4" containerID="b2e8ab1d0ebaa5c0243b8cc8cb4cb76e765927b46172de5dea378ca0b89faac0" exitCode=255 Apr 23 16:35:35.589299 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.589115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" event={"ID":"c0fa4bfb-d80b-4502-9523-25faf9cd73c4","Type":"ContainerDied","Data":"b2e8ab1d0ebaa5c0243b8cc8cb4cb76e765927b46172de5dea378ca0b89faac0"} Apr 23 16:35:35.589299 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.589232 2575 scope.go:117] "RemoveContainer" containerID="b2e8ab1d0ebaa5c0243b8cc8cb4cb76e765927b46172de5dea378ca0b89faac0" Apr 23 16:35:35.591775 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.590760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" event={"ID":"d1122629-0bc9-46dd-aed2-f2d5e2295f30","Type":"ContainerStarted","Data":"4901a4e0a04386e47ee5bcc80b2a43f53e7d591b5eba833def50a0aa6bdfa8da"} Apr 23 16:35:35.592275 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.592253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" event={"ID":"982b5b8d-3beb-472b-af96-34991930ce23","Type":"ContainerStarted","Data":"abfb06a484f9996005767fdd01a5428475e88f271131ffbdacebe8520116d4cc"} Apr 23 16:35:35.594047 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.593977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qlwjk" event={"ID":"3b7bc009-2549-40d8-b0f3-979edf176475","Type":"ContainerStarted","Data":"5957f7b9b662584abf7f5f0acea66e38ea3800393e39d8575cd8078bbde5aeeb"} Apr 23 16:35:35.629976 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.629928 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4jx4z" podStartSLOduration=2.629914296 podStartE2EDuration="2.629914296s" podCreationTimestamp="2026-04-23 16:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:35.628571805 +0000 UTC m=+42.891446285" watchObservedRunningTime="2026-04-23 16:35:35.629914296 +0000 UTC m=+42.892788807" Apr 23 16:35:35.654270 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:35.654212 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" podStartSLOduration=7.859302012 podStartE2EDuration="14.654198559s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.505283026 +0000 UTC m=+34.768157484" lastFinishedPulling="2026-04-23 16:35:34.30017956 +0000 UTC m=+41.563054031" observedRunningTime="2026-04-23 16:35:35.652893625 +0000 UTC m=+42.915768105" watchObservedRunningTime="2026-04-23 16:35:35.654198559 +0000 UTC m=+42.917073074" Apr 23 16:35:36.448692 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.448626 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-diagnostics/network-check-target-qlwjk" podStartSLOduration=36.639013673 podStartE2EDuration="43.448604066s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.504957854 +0000 UTC m=+34.767832314" lastFinishedPulling="2026-04-23 16:35:34.314548236 +0000 UTC m=+41.577422707" observedRunningTime="2026-04-23 16:35:35.680460505 +0000 UTC m=+42.943334987" watchObservedRunningTime="2026-04-23 16:35:36.448604066 +0000 UTC m=+43.711478548" Apr 23 16:35:36.449887 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.449862 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42"] Apr 23 16:35:36.488760 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.488732 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42"] Apr 23 16:35:36.488927 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.488788 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" Apr 23 16:35:36.491857 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.491832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:36.493226 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.493203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xpkqg\"" Apr 23 16:35:36.493589 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.493569 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 16:35:36.521658 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.521620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbh6d\" (UniqueName: \"kubernetes.io/projected/82ab9f64-b97b-43bc-95b2-9c09b443c480-kube-api-access-kbh6d\") pod \"migrator-74bb7799d9-qzd42\" (UID: \"82ab9f64-b97b-43bc-95b2-9c09b443c480\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" Apr 23 16:35:36.601565 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.598675 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qlwjk" Apr 23 16:35:36.601565 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.600028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:36.601565 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.600091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:36.622504 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:35:36.622464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbh6d\" (UniqueName: \"kubernetes.io/projected/82ab9f64-b97b-43bc-95b2-9c09b443c480-kube-api-access-kbh6d\") pod \"migrator-74bb7799d9-qzd42\" (UID: \"82ab9f64-b97b-43bc-95b2-9c09b443c480\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" Apr 23 16:35:36.634915 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.634874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbh6d\" (UniqueName: \"kubernetes.io/projected/82ab9f64-b97b-43bc-95b2-9c09b443c480-kube-api-access-kbh6d\") pod \"migrator-74bb7799d9-qzd42\" (UID: \"82ab9f64-b97b-43bc-95b2-9c09b443c480\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" Apr 23 16:35:36.799725 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:36.799691 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" Apr 23 16:35:37.051985 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.051925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qph7m_e5cbdaaf-58dd-418a-9710-5657fc8aae17/dns-node-resolver/0.log" Apr 23 16:35:37.113998 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.113969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42"] Apr 23 16:35:37.116411 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:37.116379 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ab9f64_b97b_43bc_95b2_9c09b443c480.slice/crio-e82a3299e50257abc8bae8fbf0ff3aa53455d3d9320000c5e94d2db6fc0984fb WatchSource:0}: Error finding container e82a3299e50257abc8bae8fbf0ff3aa53455d3d9320000c5e94d2db6fc0984fb: Status 404 returned error can't find 
the container with id e82a3299e50257abc8bae8fbf0ff3aa53455d3d9320000c5e94d2db6fc0984fb Apr 23 16:35:37.602697 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.602603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" event={"ID":"82ab9f64-b97b-43bc-95b2-9c09b443c480","Type":"ContainerStarted","Data":"e82a3299e50257abc8bae8fbf0ff3aa53455d3d9320000c5e94d2db6fc0984fb"} Apr 23 16:35:37.604004 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.603976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6tzts" event={"ID":"234f7274-d920-45c3-b270-d3c0251a85d9","Type":"ContainerStarted","Data":"9ccbe6d7d0faf7036270677768cb76575fb5a731d5b05122f7743220b79a58a4"} Apr 23 16:35:37.605417 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.605399 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:35:37.605813 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.605795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/0.log" Apr 23 16:35:37.605873 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.605833 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0fa4bfb-d80b-4502-9523-25faf9cd73c4" containerID="f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4" exitCode=255 Apr 23 16:35:37.605933 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.605918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" event={"ID":"c0fa4bfb-d80b-4502-9523-25faf9cd73c4","Type":"ContainerDied","Data":"f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4"} Apr 23 16:35:37.606039 ip-10-0-130-110 
kubenswrapper[2575]: I0423 16:35:37.605945 2575 scope.go:117] "RemoveContainer" containerID="b2e8ab1d0ebaa5c0243b8cc8cb4cb76e765927b46172de5dea378ca0b89faac0" Apr 23 16:35:37.606105 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.606090 2575 scope.go:117] "RemoveContainer" containerID="f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4" Apr 23 16:35:37.606295 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:37.606278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4s49_openshift-console-operator(c0fa4bfb-d80b-4502-9523-25faf9cd73c4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" podUID="c0fa4bfb-d80b-4502-9523-25faf9cd73c4" Apr 23 16:35:37.626212 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.626163 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6tzts" podStartSLOduration=10.121733717 podStartE2EDuration="17.626144652s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.725382558 +0000 UTC m=+36.988257016" lastFinishedPulling="2026-04-23 16:35:37.229793491 +0000 UTC m=+44.492667951" observedRunningTime="2026-04-23 16:35:37.625262084 +0000 UTC m=+44.888136563" watchObservedRunningTime="2026-04-23 16:35:37.626144652 +0000 UTC m=+44.889019132" Apr 23 16:35:37.922122 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.922037 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qnbp2"] Apr 23 16:35:37.945737 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.945697 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qnbp2"] Apr 23 16:35:37.945896 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.945840 2575 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:37.948939 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.948911 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 16:35:37.949115 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.948912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 16:35:37.949307 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.949291 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hn5tp\"" Apr 23 16:35:37.950118 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.950098 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 16:35:37.950283 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:37.950270 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 16:35:38.033106 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.033065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-key\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.033283 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.033141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-cabundle\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " 
pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.033283 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.033202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fphbl\" (UniqueName: \"kubernetes.io/projected/3f736efa-d279-4b27-baaf-42da0efe49ee-kube-api-access-fphbl\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.052681 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.052652 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fx8gg_f44bf524-5ea5-4d89-8f40-5a45087669b8/node-ca/0.log" Apr 23 16:35:38.133781 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.133744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-cabundle\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.133957 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.133807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fphbl\" (UniqueName: \"kubernetes.io/projected/3f736efa-d279-4b27-baaf-42da0efe49ee-kube-api-access-fphbl\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.133957 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.133876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-key\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " 
pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.134471 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.134452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-cabundle\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.136626 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.136602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3f736efa-d279-4b27-baaf-42da0efe49ee-signing-key\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.147163 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.147137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fphbl\" (UniqueName: \"kubernetes.io/projected/3f736efa-d279-4b27-baaf-42da0efe49ee-kube-api-access-fphbl\") pod \"service-ca-865cb79987-qnbp2\" (UID: \"3f736efa-d279-4b27-baaf-42da0efe49ee\") " pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.257698 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.257616 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-qnbp2" Apr 23 16:35:38.393718 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.393685 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qnbp2"] Apr 23 16:35:38.396359 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:38.396331 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f736efa_d279_4b27_baaf_42da0efe49ee.slice/crio-0e4ee992f5d38582793f0cde6ce0a8fa91acf2eee9b36bbd9a007f5cf0497125 WatchSource:0}: Error finding container 0e4ee992f5d38582793f0cde6ce0a8fa91acf2eee9b36bbd9a007f5cf0497125: Status 404 returned error can't find the container with id 0e4ee992f5d38582793f0cde6ce0a8fa91acf2eee9b36bbd9a007f5cf0497125 Apr 23 16:35:38.611895 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.611870 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:35:38.612337 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.612290 2575 scope.go:117] "RemoveContainer" containerID="f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4" Apr 23 16:35:38.612541 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:38.612504 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4s49_openshift-console-operator(c0fa4bfb-d80b-4502-9523-25faf9cd73c4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" podUID="c0fa4bfb-d80b-4502-9523-25faf9cd73c4" Apr 23 16:35:38.613838 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:38.613812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-qnbp2" 
event={"ID":"3f736efa-d279-4b27-baaf-42da0efe49ee","Type":"ContainerStarted","Data":"0e4ee992f5d38582793f0cde6ce0a8fa91acf2eee9b36bbd9a007f5cf0497125"} Apr 23 16:35:39.618216 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:39.618172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" event={"ID":"82ab9f64-b97b-43bc-95b2-9c09b443c480","Type":"ContainerStarted","Data":"48db3561807b9fc07c88d069120af15f6f670114d3784ab048dc641aebaf99ae"} Apr 23 16:35:39.618216 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:39.618211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" event={"ID":"82ab9f64-b97b-43bc-95b2-9c09b443c480","Type":"ContainerStarted","Data":"2832653d28db02b98dd5bdb9c648cf60896d6b3a90a2a53490da111679aa454d"} Apr 23 16:35:39.619488 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:39.619458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-qnbp2" event={"ID":"3f736efa-d279-4b27-baaf-42da0efe49ee","Type":"ContainerStarted","Data":"1e293d1e7d5da39c86e176248c519bb7b18f3a6cdd95196e2daa9967a478b8ab"} Apr 23 16:35:39.663333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:39.663272 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qzd42" podStartSLOduration=1.783022838 podStartE2EDuration="3.663247202s" podCreationTimestamp="2026-04-23 16:35:36 +0000 UTC" firstStartedPulling="2026-04-23 16:35:37.118390193 +0000 UTC m=+44.381264651" lastFinishedPulling="2026-04-23 16:35:38.998614541 +0000 UTC m=+46.261489015" observedRunningTime="2026-04-23 16:35:39.660017212 +0000 UTC m=+46.922891690" watchObservedRunningTime="2026-04-23 16:35:39.663247202 +0000 UTC m=+46.926121682" Apr 23 16:35:39.713666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:39.713606 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-qnbp2" podStartSLOduration=2.713586049 podStartE2EDuration="2.713586049s" podCreationTimestamp="2026-04-23 16:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:39.71082274 +0000 UTC m=+46.973697222" watchObservedRunningTime="2026-04-23 16:35:39.713586049 +0000 UTC m=+46.976460528" Apr 23 16:35:42.065348 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.065316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.065360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.065388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.065498 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.065565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065497 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065637 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065657 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84465bb98d-vbtvd: secret "image-registry-tls" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065561 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065644 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:58.065623251 +0000 UTC m=+65.328497727 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : secret "router-metrics-certs-default" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065721 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle podName:5f517baa-7cef-488d-bba8-ced00dd978ea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.065704306 +0000 UTC m=+65.328578770 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle") pod "router-default-9d787c8db-s4x7f" (UID: "5f517baa-7cef-488d-bba8-ced00dd978ea") : configmap references non-existent config key: service-ca.crt Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065736 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls podName:d8bc09ea-e1da-495e-98df-c9bcabf133f3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.065726957 +0000 UTC m=+65.328601420 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls") pod "image-registry-84465bb98d-vbtvd" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3") : secret "image-registry-tls" not found Apr 23 16:35:42.065816 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.065751 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls podName:46a1bc5d-9337-4a1e-9e0f-4ae982b76d79 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.065741728 +0000 UTC m=+65.328616186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jmwrb" (UID: "46a1bc5d-9337-4a1e-9e0f-4ae982b76d79") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:35:42.068094 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.068067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f11952e-c294-49f7-be03-3644d42784ef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jx8jb\" (UID: \"7f11952e-c294-49f7-be03-3644d42784ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:42.137776 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.137733 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" Apr 23 16:35:42.166988 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.166957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:42.167140 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.167002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:42.167140 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.167116 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:42.167271 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.167153 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:42.167271 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.167181 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert podName:8b59acd4-b73a-4a98-962f-47e8e830e452 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.167164604 +0000 UTC m=+65.430039063 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert") pod "ingress-canary-5kp97" (UID: "8b59acd4-b73a-4a98-962f-47e8e830e452") : secret "canary-serving-cert" not found Apr 23 16:35:42.167271 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:42.167259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls podName:a1edcb0c-3893-4226-86ba-f6c8e3795d5a nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.167246669 +0000 UTC m=+65.430121126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls") pod "dns-default-blbwq" (UID: "a1edcb0c-3893-4226-86ba-f6c8e3795d5a") : secret "dns-default-metrics-tls" not found Apr 23 16:35:42.271432 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.271405 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb"] Apr 23 16:35:42.628710 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:42.628628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" event={"ID":"7f11952e-c294-49f7-be03-3644d42784ef","Type":"ContainerStarted","Data":"9efd647f35cd2e31d1caab63e26cce06105938fe5e5323f0cad8f5179fd4b507"} Apr 23 16:35:44.635755 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:44.635720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" event={"ID":"7f11952e-c294-49f7-be03-3644d42784ef","Type":"ContainerStarted","Data":"b0a5bec300664459261c91af0739480d5fb967a8d88815aa160213059ebb4c05"} Apr 23 16:35:44.635755 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:44.635757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" event={"ID":"7f11952e-c294-49f7-be03-3644d42784ef","Type":"ContainerStarted","Data":"d7b32ea0d390a3038b6d4c211c9fcb28045ae543fc584c5b579dbdcc5aeda1eb"} Apr 23 16:35:44.658721 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:44.658662 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jx8jb" podStartSLOduration=21.987516989 podStartE2EDuration="23.658640648s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="2026-04-23 16:35:42.350354898 +0000 UTC m=+49.613229356" lastFinishedPulling="2026-04-23 16:35:44.021478555 +0000 UTC m=+51.284353015" observedRunningTime="2026-04-23 16:35:44.658062714 +0000 UTC m=+51.920937196" watchObservedRunningTime="2026-04-23 16:35:44.658640648 +0000 UTC m=+51.921515121" Apr 23 16:35:46.598849 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:46.598796 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:46.598849 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:46.598861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:46.599492 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:46.599345 2575 scope.go:117] "RemoveContainer" containerID="f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4" Apr 23 16:35:46.599618 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:35:46.599598 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4s49_openshift-console-operator(c0fa4bfb-d80b-4502-9523-25faf9cd73c4)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" podUID="c0fa4bfb-d80b-4502-9523-25faf9cd73c4" Apr 23 16:35:52.533677 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:52.533642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75zl8" Apr 23 16:35:58.093544 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.093481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:58.094022 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.093559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:58.094022 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.093588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:58.094022 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.093809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: 
\"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:58.094209 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.094187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f517baa-7cef-488d-bba8-ced00dd978ea-service-ca-bundle\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:58.096213 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.096184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f517baa-7cef-488d-bba8-ced00dd978ea-metrics-certs\") pod \"router-default-9d787c8db-s4x7f\" (UID: \"5f517baa-7cef-488d-bba8-ced00dd978ea\") " pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:58.096323 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.096187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a1bc5d-9337-4a1e-9e0f-4ae982b76d79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jmwrb\" (UID: \"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:58.096323 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.096241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"image-registry-84465bb98d-vbtvd\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:58.194800 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.194765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:58.194800 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.194807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:58.197238 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.197217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1edcb0c-3893-4226-86ba-f6c8e3795d5a-metrics-tls\") pod \"dns-default-blbwq\" (UID: \"a1edcb0c-3893-4226-86ba-f6c8e3795d5a\") " pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:58.197402 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.197381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b59acd4-b73a-4a98-962f-47e8e830e452-cert\") pod \"ingress-canary-5kp97\" (UID: \"8b59acd4-b73a-4a98-962f-47e8e830e452\") " pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:58.271665 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.271640 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-d4d8w\"" Apr 23 16:35:58.280035 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.280016 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:58.310810 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.310772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-67fr9\"" Apr 23 16:35:58.316461 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.316436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-trh28\"" Apr 23 16:35:58.318748 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.318731 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" Apr 23 16:35:58.324555 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.324461 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:58.420409 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.420380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w6ksl\"" Apr 23 16:35:58.428217 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.427725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5kp97" Apr 23 16:35:58.441408 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.441379 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-szgdw\"" Apr 23 16:35:58.443877 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.443853 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"] Apr 23 16:35:58.448334 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.446978 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-blbwq" Apr 23 16:35:58.490018 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.489950 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb"] Apr 23 16:35:58.497126 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:58.497036 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46a1bc5d_9337_4a1e_9e0f_4ae982b76d79.slice/crio-a2dffa796af9a33ff0f278db3ca010022a2aa180e0284f448ce7ffbf7d6e36d1 WatchSource:0}: Error finding container a2dffa796af9a33ff0f278db3ca010022a2aa180e0284f448ce7ffbf7d6e36d1: Status 404 returned error can't find the container with id a2dffa796af9a33ff0f278db3ca010022a2aa180e0284f448ce7ffbf7d6e36d1 Apr 23 16:35:58.510420 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.510382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9d787c8db-s4x7f"] Apr 23 16:35:58.517491 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:58.517449 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f517baa_7cef_488d_bba8_ced00dd978ea.slice/crio-b862573bfd01a3a5eeef60218bb4654389d14831d6d1d2f82676c531ed80d19f WatchSource:0}: Error finding container b862573bfd01a3a5eeef60218bb4654389d14831d6d1d2f82676c531ed80d19f: Status 404 returned error can't find the container with id b862573bfd01a3a5eeef60218bb4654389d14831d6d1d2f82676c531ed80d19f Apr 23 16:35:58.577072 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.576978 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5kp97"] Apr 23 16:35:58.579190 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:58.579159 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b59acd4_b73a_4a98_962f_47e8e830e452.slice/crio-e505b76f5ad5e0f7bf610cd52ffb41c54df9b143d64341335d61fb034e1d5a1a WatchSource:0}: Error finding container e505b76f5ad5e0f7bf610cd52ffb41c54df9b143d64341335d61fb034e1d5a1a: Status 404 returned error can't find the container with id e505b76f5ad5e0f7bf610cd52ffb41c54df9b143d64341335d61fb034e1d5a1a Apr 23 16:35:58.596772 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.596749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-blbwq"] Apr 23 16:35:58.599822 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:58.599797 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1edcb0c_3893_4226_86ba_f6c8e3795d5a.slice/crio-1969e07090a8c42e952df45715bed9cb3569bbda9b0e08d8d07b39df0f2dce38 WatchSource:0}: Error finding container 1969e07090a8c42e952df45715bed9cb3569bbda9b0e08d8d07b39df0f2dce38: Status 404 returned error can't find the container with id 1969e07090a8c42e952df45715bed9cb3569bbda9b0e08d8d07b39df0f2dce38 Apr 23 16:35:58.685142 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.685072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" event={"ID":"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79","Type":"ContainerStarted","Data":"a2dffa796af9a33ff0f278db3ca010022a2aa180e0284f448ce7ffbf7d6e36d1"} Apr 23 16:35:58.687008 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.686976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" event={"ID":"d8bc09ea-e1da-495e-98df-c9bcabf133f3","Type":"ContainerStarted","Data":"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689"} Apr 23 16:35:58.687008 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.687012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" event={"ID":"d8bc09ea-e1da-495e-98df-c9bcabf133f3","Type":"ContainerStarted","Data":"335adf2182387e2f28f07e92ee837fb25d3fc702b9e5289da1c483a850ab6a8e"} Apr 23 16:35:58.687307 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.687282 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:35:58.688381 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.688336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-blbwq" event={"ID":"a1edcb0c-3893-4226-86ba-f6c8e3795d5a","Type":"ContainerStarted","Data":"1969e07090a8c42e952df45715bed9cb3569bbda9b0e08d8d07b39df0f2dce38"} Apr 23 16:35:58.689473 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.689433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5kp97" event={"ID":"8b59acd4-b73a-4a98-962f-47e8e830e452","Type":"ContainerStarted","Data":"e505b76f5ad5e0f7bf610cd52ffb41c54df9b143d64341335d61fb034e1d5a1a"} Apr 23 16:35:58.690990 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.690966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9d787c8db-s4x7f" event={"ID":"5f517baa-7cef-488d-bba8-ced00dd978ea","Type":"ContainerStarted","Data":"615fc81b4f9e0f44a4a4f51af5de0ebbb98c5eb6a9d9439105964087cc26cb5e"} Apr 23 16:35:58.691095 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.690995 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9d787c8db-s4x7f" event={"ID":"5f517baa-7cef-488d-bba8-ced00dd978ea","Type":"ContainerStarted","Data":"b862573bfd01a3a5eeef60218bb4654389d14831d6d1d2f82676c531ed80d19f"} Apr 23 16:35:58.714453 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.714390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" podStartSLOduration=64.714373412 podStartE2EDuration="1m4.714373412s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:58.713739859 +0000 UTC m=+65.976614340" watchObservedRunningTime="2026-04-23 16:35:58.714373412 +0000 UTC m=+65.977247891" Apr 23 16:35:58.736429 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:58.736373 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-9d787c8db-s4x7f" podStartSLOduration=47.7363571 podStartE2EDuration="47.7363571s" podCreationTimestamp="2026-04-23 16:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:58.735485187 +0000 UTC m=+65.998359667" watchObservedRunningTime="2026-04-23 16:35:58.7363571 +0000 UTC m=+65.999231580" Apr 23 16:35:59.105022 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.104974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:59.108518 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.108488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53be1a16-14c5-4f4a-b293-a31505dc39e3-metrics-certs\") pod \"network-metrics-daemon-grr5m\" (UID: \"53be1a16-14c5-4f4a-b293-a31505dc39e3\") " pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:59.325726 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.325692 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:59.329128 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.329103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:59.363910 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.363035 2575 scope.go:117] "RemoveContainer" containerID="f9deac18e5435651f06d874a717075688ddbdd4dfd617b877699d4f28e4bbad4" Apr 23 16:35:59.380299 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.380089 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\"" Apr 23 16:35:59.387971 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.387736 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-grr5m" Apr 23 16:35:59.559100 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.559046 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-grr5m"] Apr 23 16:35:59.564718 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:35:59.564683 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53be1a16_14c5_4f4a_b293_a31505dc39e3.slice/crio-234641f247da5e3f4c8dc212355b106333c0407085b5f84f76023d0904a86328 WatchSource:0}: Error finding container 234641f247da5e3f4c8dc212355b106333c0407085b5f84f76023d0904a86328: Status 404 returned error can't find the container with id 234641f247da5e3f4c8dc212355b106333c0407085b5f84f76023d0904a86328 Apr 23 16:35:59.701003 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.700914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-grr5m" 
event={"ID":"53be1a16-14c5-4f4a-b293-a31505dc39e3","Type":"ContainerStarted","Data":"234641f247da5e3f4c8dc212355b106333c0407085b5f84f76023d0904a86328"} Apr 23 16:35:59.706275 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.706241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:35:59.706382 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.706311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" event={"ID":"c0fa4bfb-d80b-4502-9523-25faf9cd73c4","Type":"ContainerStarted","Data":"2170e26d6b5c1a5904b05c90872bf8e60c4b7a429318cc4edb0eef6cc053b03d"} Apr 23 16:35:59.706870 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.706850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:59.707190 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.707154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:35:59.708475 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.708453 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-9d787c8db-s4x7f" Apr 23 16:35:59.728180 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:35:59.727605 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" podStartSLOduration=31.932614062 podStartE2EDuration="38.727589201s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.504863748 +0000 UTC m=+34.767738206" lastFinishedPulling="2026-04-23 16:35:34.299838881 +0000 UTC m=+41.562713345" observedRunningTime="2026-04-23 16:35:59.726514189 +0000 UTC 
m=+66.989388670" watchObservedRunningTime="2026-04-23 16:35:59.727589201 +0000 UTC m=+66.990463683" Apr 23 16:36:00.208294 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:00.208266 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4s49" Apr 23 16:36:02.716922 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:02.716888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-blbwq" event={"ID":"a1edcb0c-3893-4226-86ba-f6c8e3795d5a","Type":"ContainerStarted","Data":"7d1cdbc4d73da82f113dd6cb386ea81d04154e543bb650df7593f5d7e1ca88e0"} Apr 23 16:36:02.718252 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:02.718228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5kp97" event={"ID":"8b59acd4-b73a-4a98-962f-47e8e830e452","Type":"ContainerStarted","Data":"47ab64e66036bbac2c50ed896e62b7ca1e6d10646616bac0ff8377c4b18bdcd3"} Apr 23 16:36:02.719515 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:02.719491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" event={"ID":"46a1bc5d-9337-4a1e-9e0f-4ae982b76d79","Type":"ContainerStarted","Data":"ae2646dbba4538296b0b06dd3e29182f3651abf075177a4856605bbe9c598ae2"} Apr 23 16:36:02.737979 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:02.737932 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5kp97" podStartSLOduration=33.014299152 podStartE2EDuration="36.737913749s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="2026-04-23 16:35:58.581208535 +0000 UTC m=+65.844083009" lastFinishedPulling="2026-04-23 16:36:02.304823137 +0000 UTC m=+69.567697606" observedRunningTime="2026-04-23 16:36:02.736279983 +0000 UTC m=+69.999154463" watchObservedRunningTime="2026-04-23 16:36:02.737913749 +0000 UTC m=+70.000788230" 
Apr 23 16:36:02.756030 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:02.755958 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jmwrb" podStartSLOduration=48.245354548 podStartE2EDuration="51.755853881s" podCreationTimestamp="2026-04-23 16:35:11 +0000 UTC" firstStartedPulling="2026-04-23 16:35:58.499131214 +0000 UTC m=+65.762005686" lastFinishedPulling="2026-04-23 16:36:02.00963056 +0000 UTC m=+69.272505019" observedRunningTime="2026-04-23 16:36:02.755000031 +0000 UTC m=+70.017874511" watchObservedRunningTime="2026-04-23 16:36:02.755853881 +0000 UTC m=+70.018728362" Apr 23 16:36:03.278671 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.278500 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc"] Apr 23 16:36:03.292386 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.292350 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7"] Apr 23 16:36:03.292574 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.292492 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.296384 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.296284 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 16:36:03.296561 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.296545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 16:36:03.296976 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.296959 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 16:36:03.297119 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.296960 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 16:36:03.297119 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.296991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7dkm8\"" Apr 23 16:36:03.307234 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.307212 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"] Apr 23 16:36:03.307374 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.307355 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.314216 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.314193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 16:36:03.314606 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.314571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 16:36:03.315099 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.314712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 16:36:03.315099 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.315061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 16:36:03.327340 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.327319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc"] Apr 23 16:36:03.327428 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.327344 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7"] Apr 23 16:36:03.327428 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.327355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"] Apr 23 16:36:03.327502 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.327465 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.330987 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.330888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 16:36:03.339328 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.339304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ckx\" (UniqueName: \"kubernetes.io/projected/10181942-7ebe-4f47-ace7-5fdafd31e1bc-kube-api-access-v4ckx\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.339425 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.339346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10181942-7ebe-4f47-ace7-5fdafd31e1bc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.381158 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.381127 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lnzv8"] Apr 23 16:36:03.401753 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.401724 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h"] Apr 23 16:36:03.401877 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.401862 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lnzv8" Apr 23 16:36:03.404454 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.404428 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:36:03.404595 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.404470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:36:03.404677 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.404602 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-zpbhl\"" Apr 23 16:36:03.421873 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.421851 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"] Apr 23 16:36:03.421995 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.421981 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.429132 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.429106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 16:36:03.429253 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.429235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 16:36:03.429310 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.429294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6bjkr\"" Apr 23 16:36:03.433725 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.433702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lnzv8"] Apr 23 16:36:03.434522 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.433838 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" Apr 23 16:36:03.439968 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.439933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f9f73f5-7fe4-474b-bd07-fd0230dea768-klusterlet-config\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.440129 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.439989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440129 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440129 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440129 
ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440123 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-hllhs\"" Apr 23 16:36:03.440339 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqbgn\" (UniqueName: \"kubernetes.io/projected/ec8315d5-149b-471d-bf2c-68ce3f157fc0-kube-api-access-dqbgn\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440339 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2tk\" (UniqueName: \"kubernetes.io/projected/2f9f73f5-7fe4-474b-bd07-fd0230dea768-kube-api-access-jq2tk\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.440339 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440280 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 16:36:03.440339 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440586 ip-10-0-130-110 kubenswrapper[2575]: 
I0423 16:36:03.440358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ckx\" (UniqueName: \"kubernetes.io/projected/10181942-7ebe-4f47-ace7-5fdafd31e1bc-kube-api-access-v4ckx\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.440586 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.440586 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f9f73f5-7fe4-474b-bd07-fd0230dea768-tmp\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.440586 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.440473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10181942-7ebe-4f47-ace7-5fdafd31e1bc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.441666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.441646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"] Apr 23 16:36:03.443609 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.443586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10181942-7ebe-4f47-ace7-5fdafd31e1bc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.444277 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.444256 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h"] Apr 23 16:36:03.462422 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.462397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ckx\" (UniqueName: \"kubernetes.io/projected/10181942-7ebe-4f47-ace7-5fdafd31e1bc-kube-api-access-v4ckx\") pod \"managed-serviceaccount-addon-agent-557799f949-h72lc\" (UID: \"10181942-7ebe-4f47-ace7-5fdafd31e1bc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.477395 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.477323 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sbjdh"] Apr 23 16:36:03.498695 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.498670 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sbjdh"] Apr 23 16:36:03.498855 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.498799 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.507166 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.506885 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:36:03.507166 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.507012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6plmq\"" Apr 23 16:36:03.507166 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.506885 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:36:03.541821 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/540051e0-118f-4008-9cf7-76727424b1aa-data-volume\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.541956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/540051e0-118f-4008-9cf7-76727424b1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.541956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/540051e0-118f-4008-9cf7-76727424b1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbjdh\" 
(UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.541956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/540051e0-118f-4008-9cf7-76727424b1aa-crio-socket\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.542131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f9f73f5-7fe4-474b-bd07-fd0230dea768-klusterlet-config\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.542131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.541997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vlq\" (UniqueName: \"kubernetes.io/projected/540051e0-118f-4008-9cf7-76727424b1aa-kube-api-access-55vlq\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.542131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542070 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542131 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvnq\" (UniqueName: \"kubernetes.io/projected/00bdf34d-629f-4acf-9d8b-4e403c540b89-kube-api-access-wjvnq\") pod \"downloads-6bcc868b7-lnzv8\" (UID: \"00bdf34d-629f-4acf-9d8b-4e403c540b89\") " pod="openshift-console/downloads-6bcc868b7-lnzv8" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dcf23f0b-61d3-43d2-90da-77da96f0cdac-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-82lq4\" (UID: \"dcf23f0b-61d3-43d2-90da-77da96f0cdac\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqbgn\" (UniqueName: 
\"kubernetes.io/projected/ec8315d5-149b-471d-bf2c-68ce3f157fc0-kube-api-access-dqbgn\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51a07785-d3a6-4de8-934f-e8cbc9da5f70-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2tk\" (UniqueName: \"kubernetes.io/projected/2f9f73f5-7fe4-474b-bd07-fd0230dea768-kube-api-access-jq2tk\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.542387 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542718 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/51a07785-d3a6-4de8-934f-e8cbc9da5f70-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.542718 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.542718 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f9f73f5-7fe4-474b-bd07-fd0230dea768-tmp\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.543031 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.542942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f9f73f5-7fe4-474b-bd07-fd0230dea768-tmp\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.543092 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.543065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.545061 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.545040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-ca\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.545167 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.545142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.545221 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.545189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f9f73f5-7fe4-474b-bd07-fd0230dea768-klusterlet-config\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.545562 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.545543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.545925 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.545909 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec8315d5-149b-471d-bf2c-68ce3f157fc0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.556029 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.556001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqbgn\" (UniqueName: \"kubernetes.io/projected/ec8315d5-149b-471d-bf2c-68ce3f157fc0-kube-api-access-dqbgn\") pod \"cluster-proxy-proxy-agent-688f48857-bgjh7\" (UID: \"ec8315d5-149b-471d-bf2c-68ce3f157fc0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.563488 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.563467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2tk\" (UniqueName: \"kubernetes.io/projected/2f9f73f5-7fe4-474b-bd07-fd0230dea768-kube-api-access-jq2tk\") pod \"klusterlet-addon-workmgr-9557b9cb4-vdxl4\" (UID: \"2f9f73f5-7fe4-474b-bd07-fd0230dea768\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" Apr 23 16:36:03.613414 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.613389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" Apr 23 16:36:03.622197 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.622168 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" Apr 23 16:36:03.643123 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51a07785-d3a6-4de8-934f-e8cbc9da5f70-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.643333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/540051e0-118f-4008-9cf7-76727424b1aa-data-volume\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/540051e0-118f-4008-9cf7-76727424b1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/540051e0-118f-4008-9cf7-76727424b1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643333 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643314 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/540051e0-118f-4008-9cf7-76727424b1aa-crio-socket\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643729 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55vlq\" (UniqueName: \"kubernetes.io/projected/540051e0-118f-4008-9cf7-76727424b1aa-kube-api-access-55vlq\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643729 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/540051e0-118f-4008-9cf7-76727424b1aa-data-volume\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643729 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/540051e0-118f-4008-9cf7-76727424b1aa-crio-socket\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 16:36:03.643929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.643916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/540051e0-118f-4008-9cf7-76727424b1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh" Apr 23 
16:36:03.644091 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.644072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvnq\" (UniqueName: \"kubernetes.io/projected/00bdf34d-629f-4acf-9d8b-4e403c540b89-kube-api-access-wjvnq\") pod \"downloads-6bcc868b7-lnzv8\" (UID: \"00bdf34d-629f-4acf-9d8b-4e403c540b89\") " pod="openshift-console/downloads-6bcc868b7-lnzv8" Apr 23 16:36:03.644163 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.644104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51a07785-d3a6-4de8-934f-e8cbc9da5f70-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.644163 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.644116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dcf23f0b-61d3-43d2-90da-77da96f0cdac-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-82lq4\" (UID: \"dcf23f0b-61d3-43d2-90da-77da96f0cdac\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" Apr 23 16:36:03.644256 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.644179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51a07785-d3a6-4de8-934f-e8cbc9da5f70-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" Apr 23 16:36:03.647116 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.647093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/dcf23f0b-61d3-43d2-90da-77da96f0cdac-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-82lq4\" (UID: \"dcf23f0b-61d3-43d2-90da-77da96f0cdac\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"
Apr 23 16:36:03.647291 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.647264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51a07785-d3a6-4de8-934f-e8cbc9da5f70-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v6r4h\" (UID: \"51a07785-d3a6-4de8-934f-e8cbc9da5f70\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h"
Apr 23 16:36:03.649956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.649928 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"
Apr 23 16:36:03.654305 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.654264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/540051e0-118f-4008-9cf7-76727424b1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh"
Apr 23 16:36:03.658350 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.658322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvnq\" (UniqueName: \"kubernetes.io/projected/00bdf34d-629f-4acf-9d8b-4e403c540b89-kube-api-access-wjvnq\") pod \"downloads-6bcc868b7-lnzv8\" (UID: \"00bdf34d-629f-4acf-9d8b-4e403c540b89\") " pod="openshift-console/downloads-6bcc868b7-lnzv8"
Apr 23 16:36:03.659121 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.659085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vlq\" (UniqueName: \"kubernetes.io/projected/540051e0-118f-4008-9cf7-76727424b1aa-kube-api-access-55vlq\") pod \"insights-runtime-extractor-sbjdh\" (UID: \"540051e0-118f-4008-9cf7-76727424b1aa\") " pod="openshift-insights/insights-runtime-extractor-sbjdh"
Apr 23 16:36:03.710848 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.710340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lnzv8"
Apr 23 16:36:03.727747 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.727676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-blbwq" event={"ID":"a1edcb0c-3893-4226-86ba-f6c8e3795d5a","Type":"ContainerStarted","Data":"3eec4c9dd9e5328e89fe4ca5174cba90592ffa324b3ef3efb824980693555fb7"}
Apr 23 16:36:03.728107 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.728005 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-blbwq"
Apr 23 16:36:03.733062 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.730467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-grr5m" event={"ID":"53be1a16-14c5-4f4a-b293-a31505dc39e3","Type":"ContainerStarted","Data":"2dafeea98c6db0b905788912279cff404f7e7e0773d83b06bc8f7cba85b1746b"}
Apr 23 16:36:03.733062 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.730498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-grr5m" event={"ID":"53be1a16-14c5-4f4a-b293-a31505dc39e3","Type":"ContainerStarted","Data":"2554a63dd6bccb15720f776ffe621c0e2edfee9a6bdb61705e3a2d34664f69c6"}
Apr 23 16:36:03.733062 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.731271 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h"
Apr 23 16:36:03.753350 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.753318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"
Apr 23 16:36:03.759349 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.757663 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-blbwq" podStartSLOduration=34.349433483 podStartE2EDuration="37.757641954s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="2026-04-23 16:35:58.601856247 +0000 UTC m=+65.864730705" lastFinishedPulling="2026-04-23 16:36:02.010064718 +0000 UTC m=+69.272939176" observedRunningTime="2026-04-23 16:36:03.757596754 +0000 UTC m=+71.020471237" watchObservedRunningTime="2026-04-23 16:36:03.757641954 +0000 UTC m=+71.020516438"
Apr 23 16:36:03.776023 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.775950 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc"]
Apr 23 16:36:03.795092 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.793778 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-grr5m" podStartSLOduration=67.814913264 podStartE2EDuration="1m10.793757755s" podCreationTimestamp="2026-04-23 16:34:53 +0000 UTC" firstStartedPulling="2026-04-23 16:35:59.568971353 +0000 UTC m=+66.831845823" lastFinishedPulling="2026-04-23 16:36:02.547815857 +0000 UTC m=+69.810690314" observedRunningTime="2026-04-23 16:36:03.791842266 +0000 UTC m=+71.054716783" watchObservedRunningTime="2026-04-23 16:36:03.793757755 +0000 UTC m=+71.056632235"
Apr 23 16:36:03.823024 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.822285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sbjdh"
Apr 23 16:36:03.824879 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.824822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7"]
Apr 23 16:36:03.843621 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.843565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"]
Apr 23 16:36:03.853144 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:03.853113 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f9f73f5_7fe4_474b_bd07_fd0230dea768.slice/crio-839ab4efa08d3562c62b02c9641e24e39f70ce508ed4d4fdb72aaf67a2a2b2ec WatchSource:0}: Error finding container 839ab4efa08d3562c62b02c9641e24e39f70ce508ed4d4fdb72aaf67a2a2b2ec: Status 404 returned error can't find the container with id 839ab4efa08d3562c62b02c9641e24e39f70ce508ed4d4fdb72aaf67a2a2b2ec
Apr 23 16:36:03.918241 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.918192 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lnzv8"]
Apr 23 16:36:03.920222 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:03.920182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00bdf34d_629f_4acf_9d8b_4e403c540b89.slice/crio-c45d479c6a95d0e191aa33c21769d724db0d1f203b2e9c03efaeb77fd6423926 WatchSource:0}: Error finding container c45d479c6a95d0e191aa33c21769d724db0d1f203b2e9c03efaeb77fd6423926: Status 404 returned error can't find the container with id c45d479c6a95d0e191aa33c21769d724db0d1f203b2e9c03efaeb77fd6423926
Apr 23 16:36:03.942183 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.942149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h"]
Apr 23 16:36:03.945206 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:03.945178 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a07785_d3a6_4de8_934f_e8cbc9da5f70.slice/crio-57aa644340457b6e580826cf57d0266aede1353425022688fc9b84858011dd2e WatchSource:0}: Error finding container 57aa644340457b6e580826cf57d0266aede1353425022688fc9b84858011dd2e: Status 404 returned error can't find the container with id 57aa644340457b6e580826cf57d0266aede1353425022688fc9b84858011dd2e
Apr 23 16:36:03.959993 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:03.959969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"]
Apr 23 16:36:03.961981 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:03.961948 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf23f0b_61d3_43d2_90da_77da96f0cdac.slice/crio-5bcc5200ade29bb2e96f9a1e948daeef37e5220629e78e0975d749cb18a72b12 WatchSource:0}: Error finding container 5bcc5200ade29bb2e96f9a1e948daeef37e5220629e78e0975d749cb18a72b12: Status 404 returned error can't find the container with id 5bcc5200ade29bb2e96f9a1e948daeef37e5220629e78e0975d749cb18a72b12
Apr 23 16:36:04.037751 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.037727 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sbjdh"]
Apr 23 16:36:04.040110 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:04.040064 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540051e0_118f_4008_9cf7_76727424b1aa.slice/crio-a165055eb40abec7765fb684a914ffffd4103751c624da5c5ba11e948867922e WatchSource:0}: Error finding container a165055eb40abec7765fb684a914ffffd4103751c624da5c5ba11e948867922e: Status 404 returned error can't find the container with id a165055eb40abec7765fb684a914ffffd4103751c624da5c5ba11e948867922e
Apr 23 16:36:04.735224 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.735169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" event={"ID":"dcf23f0b-61d3-43d2-90da-77da96f0cdac","Type":"ContainerStarted","Data":"5bcc5200ade29bb2e96f9a1e948daeef37e5220629e78e0975d749cb18a72b12"}
Apr 23 16:36:04.736937 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.736874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" event={"ID":"10181942-7ebe-4f47-ace7-5fdafd31e1bc","Type":"ContainerStarted","Data":"798ae993512d0b1d3942b382e0aa19cebea5b96bd6179f58ac94f971715a1e16"}
Apr 23 16:36:04.738712 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.738659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" event={"ID":"2f9f73f5-7fe4-474b-bd07-fd0230dea768","Type":"ContainerStarted","Data":"839ab4efa08d3562c62b02c9641e24e39f70ce508ed4d4fdb72aaf67a2a2b2ec"}
Apr 23 16:36:04.740347 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.740293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" event={"ID":"51a07785-d3a6-4de8-934f-e8cbc9da5f70","Type":"ContainerStarted","Data":"57aa644340457b6e580826cf57d0266aede1353425022688fc9b84858011dd2e"}
Apr 23 16:36:04.741954 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.741919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" event={"ID":"ec8315d5-149b-471d-bf2c-68ce3f157fc0","Type":"ContainerStarted","Data":"1f5c3f200a1f8142bdc7d43ba4531b18619db7aaf6b6427f5537e976569e8c5b"}
Apr 23 16:36:04.754851 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.754821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lnzv8" event={"ID":"00bdf34d-629f-4acf-9d8b-4e403c540b89","Type":"ContainerStarted","Data":"c45d479c6a95d0e191aa33c21769d724db0d1f203b2e9c03efaeb77fd6423926"}
Apr 23 16:36:04.771649 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.771076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbjdh" event={"ID":"540051e0-118f-4008-9cf7-76727424b1aa","Type":"ContainerStarted","Data":"1c8be5f3ba59d0410528875b26ce461cace430825954185af4973d83d6faff3e"}
Apr 23 16:36:04.771649 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:04.771115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbjdh" event={"ID":"540051e0-118f-4008-9cf7-76727424b1aa","Type":"ContainerStarted","Data":"a165055eb40abec7765fb684a914ffffd4103751c624da5c5ba11e948867922e"}
Apr 23 16:36:07.609040 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:07.609008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qlwjk"
Apr 23 16:36:11.802478 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.802356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbjdh" event={"ID":"540051e0-118f-4008-9cf7-76727424b1aa","Type":"ContainerStarted","Data":"0d0ebd2c5cfeb43950038f2890b6561a67cc1f7617b30d25d7548402bf7f215f"}
Apr 23 16:36:11.805267 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.805234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" event={"ID":"dcf23f0b-61d3-43d2-90da-77da96f0cdac","Type":"ContainerStarted","Data":"a776499a9b692a48d88ed1a67ad8c8c231b2ff4eb5805d1bb9acd4d421f6fcaa"}
Apr 23 16:36:11.805646 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.805624 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"
Apr 23 16:36:11.810311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.807405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" event={"ID":"10181942-7ebe-4f47-ace7-5fdafd31e1bc","Type":"ContainerStarted","Data":"cb3f88afafc788b11c03a2b794ebcf8d91cf558e215fd19e84968d285a4a3ec5"}
Apr 23 16:36:11.810311 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.810191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" event={"ID":"2f9f73f5-7fe4-474b-bd07-fd0230dea768","Type":"ContainerStarted","Data":"dfe7595d1efb977030c376ee716299597964bcd7f3457c0d38fb12d92287a9f7"}
Apr 23 16:36:11.810932 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.810911 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"
Apr 23 16:36:11.812250 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.812231 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4"
Apr 23 16:36:11.812344 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.812281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4"
Apr 23 16:36:11.813558 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.813513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" event={"ID":"51a07785-d3a6-4de8-934f-e8cbc9da5f70","Type":"ContainerStarted","Data":"aed722080964caf51f7bc290959989387cfcc32860f77e5df98d1b75b7f72ec0"}
Apr 23 16:36:11.815908 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.815888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" event={"ID":"ec8315d5-149b-471d-bf2c-68ce3f157fc0","Type":"ContainerStarted","Data":"9cdddbaa6c569b4a06b5a86a8abb2fd33b933c2777c05053deec12d725ed7c89"}
Apr 23 16:36:11.838434 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.837613 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-82lq4" podStartSLOduration=1.990942244 podStartE2EDuration="8.837594577s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.964239595 +0000 UTC m=+71.227114053" lastFinishedPulling="2026-04-23 16:36:10.81089191 +0000 UTC m=+78.073766386" observedRunningTime="2026-04-23 16:36:11.836337733 +0000 UTC m=+79.099212215" watchObservedRunningTime="2026-04-23 16:36:11.837594577 +0000 UTC m=+79.100469059"
Apr 23 16:36:11.870087 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.870035 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9557b9cb4-vdxl4" podStartSLOduration=1.8936603010000002 podStartE2EDuration="8.870020029s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.865011937 +0000 UTC m=+71.127886402" lastFinishedPulling="2026-04-23 16:36:10.841371667 +0000 UTC m=+78.104246130" observedRunningTime="2026-04-23 16:36:11.86767408 +0000 UTC m=+79.130548562" watchObservedRunningTime="2026-04-23 16:36:11.870020029 +0000 UTC m=+79.132894507"
Apr 23 16:36:11.908217 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.907422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-557799f949-h72lc" podStartSLOduration=1.8784759229999999 podStartE2EDuration="8.907399741s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.803041187 +0000 UTC m=+71.065915659" lastFinishedPulling="2026-04-23 16:36:10.831965014 +0000 UTC m=+78.094839477" observedRunningTime="2026-04-23 16:36:11.894267237 +0000 UTC m=+79.157141718" watchObservedRunningTime="2026-04-23 16:36:11.907399741 +0000 UTC m=+79.170274221"
Apr 23 16:36:11.923734 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:11.923405 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v6r4h" podStartSLOduration=2.056440104 podStartE2EDuration="8.923386192s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.947351158 +0000 UTC m=+71.210225616" lastFinishedPulling="2026-04-23 16:36:10.814297234 +0000 UTC m=+78.077171704" observedRunningTime="2026-04-23 16:36:11.922017047 +0000 UTC m=+79.184891528" watchObservedRunningTime="2026-04-23 16:36:11.923386192 +0000 UTC m=+79.186260673"
Apr 23 16:36:12.823999 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:12.823908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbjdh" event={"ID":"540051e0-118f-4008-9cf7-76727424b1aa","Type":"ContainerStarted","Data":"ee45876221e0d7151f48970afd32f95e9351ac9de7bc6c6d6e9680784b66f050"}
Apr 23 16:36:12.845949 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:12.845098 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sbjdh" podStartSLOduration=1.499945957 podStartE2EDuration="9.845082033s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:04.186105473 +0000 UTC m=+71.448979930" lastFinishedPulling="2026-04-23 16:36:12.531241544 +0000 UTC m=+79.794116006" observedRunningTime="2026-04-23 16:36:12.844171047 +0000 UTC m=+80.107045525" watchObservedRunningTime="2026-04-23 16:36:12.845082033 +0000 UTC m=+80.107956576"
Apr 23 16:36:13.773015 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:13.772969 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-blbwq"
Apr 23 16:36:13.830760 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:13.830066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" event={"ID":"ec8315d5-149b-471d-bf2c-68ce3f157fc0","Type":"ContainerStarted","Data":"77bdc0d3912bce59ae3867634f502ae0f2f70704fdfb0c33cb40338baab6938f"}
Apr 23 16:36:13.830760 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:13.830107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" event={"ID":"ec8315d5-149b-471d-bf2c-68ce3f157fc0","Type":"ContainerStarted","Data":"0847ac05604f3cc88325eda9a79725ec9bd1b28d9f40db55c4c759ab60bd6d13"}
Apr 23 16:36:13.852719 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:13.852278 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-688f48857-bgjh7" podStartSLOduration=1.191096851 podStartE2EDuration="10.852256775s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.833611739 +0000 UTC m=+71.096486213" lastFinishedPulling="2026-04-23 16:36:13.494771665 +0000 UTC m=+80.757646137" observedRunningTime="2026-04-23 16:36:13.850148599 +0000 UTC m=+81.113023143" watchObservedRunningTime="2026-04-23 16:36:13.852256775 +0000 UTC m=+81.115131257"
Apr 23 16:36:18.290270 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.290220 2575 patch_prober.go:28] interesting pod/image-registry-84465bb98d-vbtvd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 16:36:18.291017 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.290311 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:36:18.363189 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.363151 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nb7b2"]
Apr 23 16:36:18.368739 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.368713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.375018 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.374986 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:36:18.375177 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.374986 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k5wxx\""
Apr 23 16:36:18.376285 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.376258 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:36:18.379418 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.379395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:36:18.379956 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.379937 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 16:36:18.496241 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-textfile\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496241 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rgf5\" (UniqueName: \"kubernetes.io/projected/2b3d124d-3598-4f79-b153-0ddc5a208ba7-kube-api-access-4rgf5\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-tls\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-root\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496679 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-sys\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496679 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-accelerators-collector-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496679 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-wtmp\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.496679 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.496622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-metrics-client-ca\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.597766 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-textfile\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.597766 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rgf5\" (UniqueName: \"kubernetes.io/projected/2b3d124d-3598-4f79-b153-0ddc5a208ba7-kube-api-access-4rgf5\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.597766 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-tls\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-root\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-sys\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-accelerators-collector-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-wtmp\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598055 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.597941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-metrics-client-ca\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598336 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-textfile\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598336 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-root\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598336 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-sys\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598336 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-wtmp\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598615 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-metrics-client-ca\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.598783 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.598758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-accelerators-collector-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.605340 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.601068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-tls\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.605340 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.601152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3d124d-3598-4f79-b153-0ddc5a208ba7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.606409 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.606386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rgf5\" (UniqueName: \"kubernetes.io/projected/2b3d124d-3598-4f79-b153-0ddc5a208ba7-kube-api-access-4rgf5\") pod \"node-exporter-nb7b2\" (UID: \"2b3d124d-3598-4f79-b153-0ddc5a208ba7\") " pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:18.680672 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:18.680630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nb7b2"
Apr 23 16:36:19.711367 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:19.711339 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd"
Apr 23 16:36:22.428995 ip-10-0-130-110 kubenswrapper[2575]: W0423 16:36:22.428963 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3d124d_3598_4f79_b153_0ddc5a208ba7.slice/crio-cdf965588656b6d1de9f288befac70187d41d1c5cb50984dbdf9298ac0f9bf7e WatchSource:0}: Error finding container cdf965588656b6d1de9f288befac70187d41d1c5cb50984dbdf9298ac0f9bf7e: Status 404 returned error can't find the container with id cdf965588656b6d1de9f288befac70187d41d1c5cb50984dbdf9298ac0f9bf7e
Apr 23 16:36:22.861426 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.861378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lnzv8" event={"ID":"00bdf34d-629f-4acf-9d8b-4e403c540b89","Type":"ContainerStarted","Data":"64b2c260e911e5b0094895e3f21a1bc4ba2a0cab1a75bf6e5eebb9f172334d8c"}
Apr 23 16:36:22.861792 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.861767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lnzv8"
Apr 23 16:36:22.863130 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.863103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nb7b2" event={"ID":"2b3d124d-3598-4f79-b153-0ddc5a208ba7","Type":"ContainerStarted","Data":"cdf965588656b6d1de9f288befac70187d41d1c5cb50984dbdf9298ac0f9bf7e"}
Apr 23 16:36:22.863674 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.863648 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-lnzv8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.23:8080/\": dial tcp 10.132.0.23:8080: connect: connection refused" start-of-body=
Apr 23 16:36:22.863771 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.863718 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-lnzv8" podUID="00bdf34d-629f-4acf-9d8b-4e403c540b89" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.23:8080/\": dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 23 16:36:22.890129 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:22.890074 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lnzv8" podStartSLOduration=1.062708993 podStartE2EDuration="19.890056225s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="2026-04-23 16:36:03.923769989 +0000 UTC m=+71.186644460" lastFinishedPulling="2026-04-23 16:36:22.751117218 +0000 UTC m=+90.013991692" observedRunningTime="2026-04-23 16:36:22.88821196 +0000 UTC m=+90.151086440" watchObservedRunningTime="2026-04-23 16:36:22.890056225 +0000 UTC m=+90.152930703"
Apr 23 16:36:23.868303 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:23.868266 2575 generic.go:358] "Generic (PLEG): container finished" podID="2b3d124d-3598-4f79-b153-0ddc5a208ba7" containerID="f4e05c6998d82ec4efae7fd21ac200a3a95e7c9569cac9a496ad1850179dcae5" exitCode=0
Apr 23 16:36:23.868813 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:23.868364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nb7b2" event={"ID":"2b3d124d-3598-4f79-b153-0ddc5a208ba7","Type":"ContainerDied","Data":"f4e05c6998d82ec4efae7fd21ac200a3a95e7c9569cac9a496ad1850179dcae5"}
Apr 23 16:36:23.872545 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:23.872505 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lnzv8"
Apr 23 16:36:24.874926 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:24.874886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nb7b2" event={"ID":"2b3d124d-3598-4f79-b153-0ddc5a208ba7","Type":"ContainerStarted","Data":"805315fad8066651bc91d0f0d17d34faa7683560e360a75847f46cafaa4e9364"}
Apr 23 16:36:24.875395 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:24.874932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nb7b2" event={"ID":"2b3d124d-3598-4f79-b153-0ddc5a208ba7","Type":"ContainerStarted","Data":"3f4cd24bb275e1673fd300ea99caf508a67507693f823b7dfe634612d4255f1d"}
Apr 23 16:36:24.899049 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:24.898996 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nb7b2" podStartSLOduration=6.094614991 podStartE2EDuration="6.898980813s" podCreationTimestamp="2026-04-23 16:36:18 +0000 UTC" firstStartedPulling="2026-04-23 16:36:22.431175179 +0000 UTC m=+89.694049636" lastFinishedPulling="2026-04-23 16:36:23.23554098 +0000 UTC m=+90.498415458" observedRunningTime="2026-04-23 16:36:24.896378085 +0000 UTC m=+92.159252576" watchObservedRunningTime="2026-04-23 16:36:24.898980813 +0000 UTC m=+92.161855292"
Apr 23 16:36:31.145968 ip-10-0-130-110 kubenswrapper[2575]:
I0423 16:36:31.145936 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"] Apr 23 16:36:40.927784 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:40.927749 2575 generic.go:358] "Generic (PLEG): container finished" podID="982b5b8d-3beb-472b-af96-34991930ce23" containerID="abfb06a484f9996005767fdd01a5428475e88f271131ffbdacebe8520116d4cc" exitCode=0 Apr 23 16:36:40.928196 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:40.927801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" event={"ID":"982b5b8d-3beb-472b-af96-34991930ce23","Type":"ContainerDied","Data":"abfb06a484f9996005767fdd01a5428475e88f271131ffbdacebe8520116d4cc"} Apr 23 16:36:40.928196 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:40.928111 2575 scope.go:117] "RemoveContainer" containerID="abfb06a484f9996005767fdd01a5428475e88f271131ffbdacebe8520116d4cc" Apr 23 16:36:41.932776 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:41.932739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wgk4r" event={"ID":"982b5b8d-3beb-472b-af96-34991930ce23","Type":"ContainerStarted","Data":"e66d30e820038d78ce00d85e8236997eb92a505ae6a36ea9bc265448dc552835"} Apr 23 16:36:45.945567 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:45.945499 2575 generic.go:358] "Generic (PLEG): container finished" podID="38274876-af8d-4558-a05f-022de9171c7d" containerID="94030261ff5411f26d9cc012dbe497d78f67e8575d33d9adf66f48d4f0050180" exitCode=0 Apr 23 16:36:45.946047 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:45.945585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" 
event={"ID":"38274876-af8d-4558-a05f-022de9171c7d","Type":"ContainerDied","Data":"94030261ff5411f26d9cc012dbe497d78f67e8575d33d9adf66f48d4f0050180"} Apr 23 16:36:45.946047 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:45.945913 2575 scope.go:117] "RemoveContainer" containerID="94030261ff5411f26d9cc012dbe497d78f67e8575d33d9adf66f48d4f0050180" Apr 23 16:36:46.950446 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:46.950407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6vzmb" event={"ID":"38274876-af8d-4558-a05f-022de9171c7d","Type":"ContainerStarted","Data":"df27abd916cf80254583457d1748c7f0a1282c98166abae0c004616f1ed88939"} Apr 23 16:36:56.169421 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.169360 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerName="registry" containerID="cri-o://083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689" gracePeriod=30 Apr 23 16:36:56.426574 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.426495 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:36:56.510397 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510359 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510627 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510408 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjz6b\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510627 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510436 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510627 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510489 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510627 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510549 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 
16:36:56.510627 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510573 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510887 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510794 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.510887 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.510846 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted\") pod \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\" (UID: \"d8bc09ea-e1da-495e-98df-c9bcabf133f3\") " Apr 23 16:36:56.511066 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.511014 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:56.511066 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.511008 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:56.511194 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.511142 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-trusted-ca\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.511194 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.511163 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-certificates\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.513150 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.513125 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:56.513259 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.513180 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:56.513421 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.513399 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:56.513559 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.513511 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b" (OuterVolumeSpecName: "kube-api-access-gjz6b") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "kube-api-access-gjz6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:56.513742 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.513721 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:56.519297 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.519273 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d8bc09ea-e1da-495e-98df-c9bcabf133f3" (UID: "d8bc09ea-e1da-495e-98df-c9bcabf133f3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:36:56.612003 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.611974 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-image-registry-private-configuration\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.612003 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.612002 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8bc09ea-e1da-495e-98df-c9bcabf133f3-ca-trust-extracted\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.612477 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.612013 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8bc09ea-e1da-495e-98df-c9bcabf133f3-installation-pull-secrets\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.612477 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.612022 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjz6b\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-kube-api-access-gjz6b\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.612477 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.612031 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-bound-sa-token\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath \"\"" Apr 23 16:36:56.612477 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.612040 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8bc09ea-e1da-495e-98df-c9bcabf133f3-registry-tls\") on node \"ip-10-0-130-110.ec2.internal\" DevicePath 
\"\"" Apr 23 16:36:56.981223 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.981187 2575 generic.go:358] "Generic (PLEG): container finished" podID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerID="083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689" exitCode=0 Apr 23 16:36:56.981462 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.981241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" event={"ID":"d8bc09ea-e1da-495e-98df-c9bcabf133f3","Type":"ContainerDied","Data":"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689"} Apr 23 16:36:56.981462 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.981248 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" Apr 23 16:36:56.981462 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.981265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84465bb98d-vbtvd" event={"ID":"d8bc09ea-e1da-495e-98df-c9bcabf133f3","Type":"ContainerDied","Data":"335adf2182387e2f28f07e92ee837fb25d3fc702b9e5289da1c483a850ab6a8e"} Apr 23 16:36:56.981462 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.981281 2575 scope.go:117] "RemoveContainer" containerID="083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689" Apr 23 16:36:56.990074 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.990058 2575 scope.go:117] "RemoveContainer" containerID="083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689" Apr 23 16:36:56.990305 ip-10-0-130-110 kubenswrapper[2575]: E0423 16:36:56.990287 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689\": container with ID starting with 083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689 not found: ID does not 
exist" containerID="083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689" Apr 23 16:36:56.990382 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:56.990311 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689"} err="failed to get container status \"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689\": rpc error: code = NotFound desc = could not find container \"083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689\": container with ID starting with 083645918f2e68a6c73d064f14f3316de9823728f84268ea6ddf84284b54d689 not found: ID does not exist" Apr 23 16:36:57.003371 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:57.003346 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"] Apr 23 16:36:57.009788 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:57.009765 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84465bb98d-vbtvd"] Apr 23 16:36:57.364005 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:57.363976 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" path="/var/lib/kubelet/pods/d8bc09ea-e1da-495e-98df-c9bcabf133f3/volumes" Apr 23 16:36:59.991617 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:59.991584 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d5e2043-b837-4d41-b2f9-b6155ad80b36" containerID="869b250ef1696944393d7d450136adc463bec194ea76b16a98d8113af530d1f8" exitCode=0 Apr 23 16:36:59.992001 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:59.991638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sqtch" event={"ID":"4d5e2043-b837-4d41-b2f9-b6155ad80b36","Type":"ContainerDied","Data":"869b250ef1696944393d7d450136adc463bec194ea76b16a98d8113af530d1f8"} Apr 23 
16:36:59.992062 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:36:59.992027 2575 scope.go:117] "RemoveContainer" containerID="869b250ef1696944393d7d450136adc463bec194ea76b16a98d8113af530d1f8" Apr 23 16:37:00.995666 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:00.995628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sqtch" event={"ID":"4d5e2043-b837-4d41-b2f9-b6155ad80b36","Type":"ContainerStarted","Data":"5a41ac5e93a1fec8d3774e483f2a858fc6a50faf92d056819a0c45da175454ad"} Apr 23 16:37:02.910569 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:02.910511 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-jmwrb_46a1bc5d-9337-4a1e-9e0f-4ae982b76d79/cluster-monitoring-operator/0.log" Apr 23 16:37:04.717227 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:04.717196 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/init-textfile/0.log" Apr 23 16:37:04.909084 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:04.909029 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/node-exporter/0.log" Apr 23 16:37:05.107297 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:05.107252 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/kube-rbac-proxy/0.log" Apr 23 16:37:08.308057 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:08.308030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-82lq4_dcf23f0b-61d3-43d2-90da-77da96f0cdac/prometheus-operator-admission-webhook/0.log" Apr 23 16:37:10.307569 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:10.307514 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-v6r4h_51a07785-d3a6-4de8-934f-e8cbc9da5f70/networking-console-plugin/0.log" Apr 23 16:37:10.507826 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:10.507794 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:37:10.711061 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:10.710978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/2.log" Apr 23 16:37:11.310762 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:11.310732 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lnzv8_00bdf34d-629f-4acf-9d8b-4e403c540b89/download-server/0.log" Apr 23 16:37:11.508461 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:11.508420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-9d787c8db-s4x7f_5f517baa-7cef-488d-bba8-ced00dd978ea/router/0.log" Apr 23 16:37:11.708312 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:37:11.708231 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5kp97_8b59acd4-b73a-4a98-962f-47e8e830e452/serve-healthcheck-canary/0.log" Apr 23 16:39:53.284843 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:39:53.284807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:39:53.287552 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:39:53.285954 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 
16:39:53.297479 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:39:53.297451 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:39:53.297740 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:39:53.297726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:39:53.300696 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:39:53.300678 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:44:53.313929 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:44:53.313897 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:44:53.314413 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:44:53.314361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:44:53.320196 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:44:53.320172 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:44:53.320520 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:44:53.320500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:49:53.336867 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:49:53.336797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:49:53.337321 ip-10-0-130-110 kubenswrapper[2575]: 
I0423 16:49:53.336797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:49:53.342804 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:49:53.342781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:49:53.343151 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:49:53.343134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:54:53.360850 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:54:53.360819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:54:53.361449 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:54:53.361045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log" Apr 23 16:54:53.372635 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:54:53.372611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:54:53.372824 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:54:53.372805 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log" Apr 23 16:59:53.386985 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:59:53.386957 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 16:59:53.388782 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:59:53.388759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 16:59:53.392808 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:59:53.392789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 16:59:53.394418 ip-10-0-130-110 kubenswrapper[2575]: I0423 16:59:53.394401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:04:53.407897 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:04:53.407869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:04:53.410702 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:04:53.410683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:04:53.413278 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:04:53.413257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:04:53.416064 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:04:53.416047 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:09:53.434758 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:09:53.434723 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:09:53.441200 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:09:53.441177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:09:53.443010 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:09:53.442987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:09:53.446563 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:09:53.446542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:14:53.458089 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:14:53.458048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:14:53.463013 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:14:53.462988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:14:53.464236 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:14:53.464216 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:14:53.468359 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:14:53.468344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:19:53.479332 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:19:53.479253 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:19:53.484331 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:19:53.484302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:19:53.484484 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:19:53.484429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:19:53.489895 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:19:53.489877 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:24:53.499733 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:24:53.499696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:24:53.505429 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:24:53.505406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:24:53.506870 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:24:53.506850 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:24:53.514640 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:24:53.514616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:29:53.521679 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:29:53.521644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:29:53.527629 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:29:53.527590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:29:53.531436 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:29:53.531413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:29:53.537032 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:29:53.537009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:34:53.546687 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:34:53.546576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:34:53.552730 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:34:53.552707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:34:53.555357 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:34:53.555340 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:34:53.560627 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:34:53.560611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:37:36.028396 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:36.028347 2575 ???:1] "http: TLS handshake error from 10.0.135.57:56822: EOF"
Apr 23 17:37:36.039391 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:36.039354 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6tzts_234f7274-d920-45c3-b270-d3c0251a85d9/global-pull-secret-syncer/0.log"
Apr 23 17:37:36.212718 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:36.212681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7fhkj_2cdbd610-b42f-4ea5-9463-a02f295038a1/konnectivity-agent/0.log"
Apr 23 17:37:36.341734 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:36.341635 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-110.ec2.internal_edb81deebb56b756d8013eea0bbef625/haproxy/0.log"
Apr 23 17:37:40.274435 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:40.274389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-jmwrb_46a1bc5d-9337-4a1e-9e0f-4ae982b76d79/cluster-monitoring-operator/0.log"
Apr 23 17:37:40.562848 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:40.562755 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/node-exporter/0.log"
Apr 23 17:37:40.585222 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:40.585186 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/kube-rbac-proxy/0.log"
Apr 23 17:37:40.611401 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:40.611371 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nb7b2_2b3d124d-3598-4f79-b153-0ddc5a208ba7/init-textfile/0.log"
Apr 23 17:37:41.317988 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:41.317961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-82lq4_dcf23f0b-61d3-43d2-90da-77da96f0cdac/prometheus-operator-admission-webhook/0.log"
Apr 23 17:37:42.521716 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:42.521683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-v6r4h_51a07785-d3a6-4de8-934f-e8cbc9da5f70/networking-console-plugin/0.log"
Apr 23 17:37:42.917697 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:42.917624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/1.log"
Apr 23 17:37:42.926234 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:42.926209 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4s49_c0fa4bfb-d80b-4502-9523-25faf9cd73c4/console-operator/2.log"
Apr 23 17:37:43.164088 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.164060 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"]
Apr 23 17:37:43.164371 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.164360 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerName="registry"
Apr 23 17:37:43.164414 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.164373 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerName="registry"
Apr 23 17:37:43.164452 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.164426 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8bc09ea-e1da-495e-98df-c9bcabf133f3" containerName="registry"
Apr 23 17:37:43.167315 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.167296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.170358 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.170301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"kube-root-ca.crt\""
Apr 23 17:37:43.170461 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.170358 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"openshift-service-ca.crt\""
Apr 23 17:37:43.170461 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.170398 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lzvhv\"/\"default-dockercfg-xx7s6\""
Apr 23 17:37:43.175404 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.175381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"]
Apr 23 17:37:43.226956 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.226924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-sys\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.226956 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.226955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-proc\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.227134 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.226987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-podres\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.227134 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.227057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvvh\" (UniqueName: \"kubernetes.io/projected/ecec1b1b-886c-43e5-84f0-05a511d1a92c-kube-api-access-6lvvh\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.227134 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.227117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-lib-modules\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328170 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-lib-modules\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328170 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-sys\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328352 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-proc\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328352 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-proc\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328352 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-sys\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328352 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-lib-modules\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328352 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-podres\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328507 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvvh\" (UniqueName: \"kubernetes.io/projected/ecec1b1b-886c-43e5-84f0-05a511d1a92c-kube-api-access-6lvvh\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.328507 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.328382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ecec1b1b-886c-43e5-84f0-05a511d1a92c-podres\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.337119 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.337088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvvh\" (UniqueName: \"kubernetes.io/projected/ecec1b1b-886c-43e5-84f0-05a511d1a92c-kube-api-access-6lvvh\") pod \"perf-node-gather-daemonset-cnlsn\" (UID: \"ecec1b1b-886c-43e5-84f0-05a511d1a92c\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.377131 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.377098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lnzv8_00bdf34d-629f-4acf-9d8b-4e403c540b89/download-server/0.log"
Apr 23 17:37:43.478599 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.478481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:43.602072 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.602042 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"]
Apr 23 17:37:43.605497 ip-10-0-130-110 kubenswrapper[2575]: W0423 17:37:43.605469 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podecec1b1b_886c_43e5_84f0_05a511d1a92c.slice/crio-866c9fd14e1135293c2971e27e541d347c34a7c32a590d379ea02a1a1730bab3 WatchSource:0}: Error finding container 866c9fd14e1135293c2971e27e541d347c34a7c32a590d379ea02a1a1730bab3: Status 404 returned error can't find the container with id 866c9fd14e1135293c2971e27e541d347c34a7c32a590d379ea02a1a1730bab3
Apr 23 17:37:43.607332 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.607315 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:37:43.775654 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:43.775629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-768xz_ff43e2b1-dadc-4902-8486-e9cccc93a38b/volume-data-source-validator/0.log"
Apr 23 17:37:44.447356 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.447329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-blbwq_a1edcb0c-3893-4226-86ba-f6c8e3795d5a/dns/0.log"
Apr 23 17:37:44.467377 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.467347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-blbwq_a1edcb0c-3893-4226-86ba-f6c8e3795d5a/kube-rbac-proxy/0.log"
Apr 23 17:37:44.579728 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.579677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn" event={"ID":"ecec1b1b-886c-43e5-84f0-05a511d1a92c","Type":"ContainerStarted","Data":"0a4a463c77d62c35fa4481b672f19ff464f232b58dffd0fb3368d0da61d9586a"}
Apr 23 17:37:44.579728 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.579723 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn" event={"ID":"ecec1b1b-886c-43e5-84f0-05a511d1a92c","Type":"ContainerStarted","Data":"866c9fd14e1135293c2971e27e541d347c34a7c32a590d379ea02a1a1730bab3"}
Apr 23 17:37:44.579940 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.579754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:44.598645 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.598602 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn" podStartSLOduration=1.598588206 podStartE2EDuration="1.598588206s" podCreationTimestamp="2026-04-23 17:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:37:44.59728477 +0000 UTC m=+3771.860159250" watchObservedRunningTime="2026-04-23 17:37:44.598588206 +0000 UTC m=+3771.861462686"
Apr 23 17:37:44.630485 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:44.630456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qph7m_e5cbdaaf-58dd-418a-9710-5657fc8aae17/dns-node-resolver/0.log"
Apr 23 17:37:45.144940 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:45.144909 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fx8gg_f44bf524-5ea5-4d89-8f40-5a45087669b8/node-ca/0.log"
Apr 23 17:37:45.969990 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:45.969954 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-9d787c8db-s4x7f_5f517baa-7cef-488d-bba8-ced00dd978ea/router/0.log"
Apr 23 17:37:46.315091 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.315059 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5kp97_8b59acd4-b73a-4a98-962f-47e8e830e452/serve-healthcheck-canary/0.log"
Apr 23 17:37:46.745198 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.745108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sqtch_4d5e2043-b837-4d41-b2f9-b6155ad80b36/insights-operator/0.log"
Apr 23 17:37:46.750169 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.750144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sqtch_4d5e2043-b837-4d41-b2f9-b6155ad80b36/insights-operator/1.log"
Apr 23 17:37:46.923150 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.923091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbjdh_540051e0-118f-4008-9cf7-76727424b1aa/kube-rbac-proxy/0.log"
Apr 23 17:37:46.945337 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.945311 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbjdh_540051e0-118f-4008-9cf7-76727424b1aa/exporter/0.log"
Apr 23 17:37:46.972283 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:46.972252 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbjdh_540051e0-118f-4008-9cf7-76727424b1aa/extractor/0.log"
Apr 23 17:37:50.593474 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:50.593441 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-cnlsn"
Apr 23 17:37:53.913599 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:53.913564 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qzd42_82ab9f64-b97b-43bc-95b2-9c09b443c480/migrator/0.log"
Apr 23 17:37:53.933831 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:53.933806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qzd42_82ab9f64-b97b-43bc-95b2-9c09b443c480/graceful-termination/0.log"
Apr 23 17:37:54.317189 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:54.317141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6vzmb_38274876-af8d-4558-a05f-022de9171c7d/kube-storage-version-migrator-operator/1.log"
Apr 23 17:37:54.318983 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:54.318963 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6vzmb_38274876-af8d-4558-a05f-022de9171c7d/kube-storage-version-migrator-operator/0.log"
Apr 23 17:37:55.428832 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.428799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:37:55.449967 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.449937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/egress-router-binary-copy/0.log"
Apr 23 17:37:55.470029 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.470003 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/cni-plugins/0.log"
Apr 23 17:37:55.493181 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.493153 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/bond-cni-plugin/0.log"
Apr 23 17:37:55.512846 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.512823 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/routeoverride-cni/0.log"
Apr 23 17:37:55.535738 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.535714 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/whereabouts-cni-bincopy/0.log"
Apr 23 17:37:55.557511 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.557488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kznw2_0a621457-5864-41d6-92e2-1094d54e41ae/whereabouts-cni/0.log"
Apr 23 17:37:55.801638 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.801609 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x4nj8_6e7cb2c4-5b50-4081-b3e2-d41f862f81f2/kube-multus/0.log"
Apr 23 17:37:55.900323 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.900294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-grr5m_53be1a16-14c5-4f4a-b293-a31505dc39e3/network-metrics-daemon/0.log"
Apr 23 17:37:55.919752 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:55.919717 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-grr5m_53be1a16-14c5-4f4a-b293-a31505dc39e3/kube-rbac-proxy/0.log"
Apr 23 17:37:56.753775 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.753743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-controller/0.log"
Apr 23 17:37:56.771472 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.771447 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/0.log"
Apr 23 17:37:56.805260 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.805229 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovn-acl-logging/1.log"
Apr 23 17:37:56.829570 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.829522 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/kube-rbac-proxy-node/0.log"
Apr 23 17:37:56.860454 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.860426 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:37:56.883691 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.883665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/northd/0.log"
Apr 23 17:37:56.909753 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.909726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/nbdb/0.log"
Apr 23 17:37:56.935839 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:56.935817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/sbdb/0.log"
Apr 23 17:37:57.113613 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:57.113584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75zl8_9a94ae8c-bc2c-4382-b11c-ef8e0a012d18/ovnkube-controller/0.log"
Apr 23 17:37:59.031883 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:59.031848 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4jx4z_d1122629-0bc9-46dd-aed2-f2d5e2295f30/check-endpoints/0.log"
Apr 23 17:37:59.102972 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:37:59.102946 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qlwjk_3b7bc009-2549-40d8-b0f3-979edf176475/network-check-target-container/0.log"
Apr 23 17:38:00.047678 ip-10-0-130-110 kubenswrapper[2575]: I0423 17:38:00.047647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gfsbv_2333206a-7852-464c-abb6-f9aab741441c/iptables-alerter/0.log"